0s autopkgtest [02:29:32]: starting date and time: 2025-01-18 02:29:32+0000
0s autopkgtest [02:29:32]: git checkout: 325255d2 Merge branch 'pin-any-arch' into 'ubuntu/production'
0s autopkgtest [02:29:32]: host juju-7f2275-prod-proposed-migration-environment-15; command line: /home/ubuntu/autopkgtest/runner/autopkgtest --output-dir /tmp/autopkgtest-work.y30ecfqu/out --timeout-copy=6000 --setup-commands /home/ubuntu/autopkgtest-cloud/worker-config-production/setup-canonical.sh --apt-pocket=proposed=src:python-urllib3 --apt-upgrade vcr.py --timeout-short=300 --timeout-copy=20000 --timeout-build=20000 --env=ADT_TEST_TRIGGERS=python-urllib3/2.3.0-1 -- ssh -s /home/ubuntu/autopkgtest/ssh-setup/nova -- --flavor autopkgtest-s390x --security-groups autopkgtest-juju-7f2275-prod-proposed-migration-environment-15@bos03-s390x-16.secgroup --name adt-plucky-s390x-vcr.py-20250118-022932-juju-7f2275-prod-proposed-migration-environment-15-cbba265f-67fa-419e-b994-77d1a78e18a7 --image adt/ubuntu-plucky-s390x-server --keyname testbed-juju-7f2275-prod-proposed-migration-environment-15 --net-id=net_prod-proposed-migration-s390x -e TERM=linux -e ''"'"'http_proxy=http://squid.internal:3128'"'"'' -e ''"'"'https_proxy=http://squid.internal:3128'"'"'' -e ''"'"'no_proxy=127.0.0.1,127.0.1.1,login.ubuntu.com,localhost,localdomain,novalocal,internal,archive.ubuntu.com,ports.ubuntu.com,security.ubuntu.com,ddebs.ubuntu.com,changelogs.ubuntu.com,keyserver.ubuntu.com,launchpadlibrarian.net,launchpadcontent.net,launchpad.net,10.24.0.0/24,keystone.ps5.canonical.com,objectstorage.prodstack5.canonical.com,radosgw.ps5.canonical.com'"'"'' --mirror=http://ftpmaster.internal/ubuntu/
93s autopkgtest [02:31:05]: testbed dpkg architecture: s390x
93s autopkgtest [02:31:05]: testbed apt version: 2.9.18
93s autopkgtest [02:31:05]: @@@@@@@@@@@@@@@@@@@@ test bed setup
94s autopkgtest [02:31:06]: testbed release detected to be: None
94s autopkgtest [02:31:06]: updating testbed package index (apt update)
95s Get:1 http://ftpmaster.internal/ubuntu plucky-proposed InRelease [73.9 kB]
95s Hit:2 http://ftpmaster.internal/ubuntu plucky InRelease
95s Hit:3 http://ftpmaster.internal/ubuntu plucky-updates InRelease
95s Hit:4 http://ftpmaster.internal/ubuntu plucky-security InRelease
95s Get:5 http://ftpmaster.internal/ubuntu plucky-proposed/main Sources [156 kB]
95s Get:6 http://ftpmaster.internal/ubuntu plucky-proposed/universe Sources [838 kB]
95s Get:7 http://ftpmaster.internal/ubuntu plucky-proposed/restricted Sources [9708 B]
95s Get:8 http://ftpmaster.internal/ubuntu plucky-proposed/multiverse Sources [15.3 kB]
95s Get:9 http://ftpmaster.internal/ubuntu plucky-proposed/main s390x Packages [265 kB]
95s Get:10 http://ftpmaster.internal/ubuntu plucky-proposed/restricted s390x Packages [756 B]
95s Get:11 http://ftpmaster.internal/ubuntu plucky-proposed/universe s390x Packages [922 kB]
95s Get:12 http://ftpmaster.internal/ubuntu plucky-proposed/multiverse s390x Packages [4972 B]
95s Fetched 2286 kB in 1s (2395 kB/s)
96s Reading package lists...
97s Reading package lists...
97s Building dependency tree...
97s Reading state information...
97s Calculating upgrade...
97s The following packages will be upgraded:
97s   gcc-14-base libatomic1 libgcc-s1 libstdc++6 python3-certifi python3-chardet
97s   python3-jwt rng-tools-debian usb.ids
97s 9 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
97s Need to get 1408 kB of archives.
97s After this operation, 1024 B disk space will be freed.
97s Get:1 http://ftpmaster.internal/ubuntu plucky/universe s390x rng-tools-debian s390x 2.6 [44.6 kB] 97s Get:2 http://ftpmaster.internal/ubuntu plucky/main s390x libatomic1 s390x 14.2.0-13ubuntu1 [9422 B] 97s Get:3 http://ftpmaster.internal/ubuntu plucky/main s390x gcc-14-base s390x 14.2.0-13ubuntu1 [53.0 kB] 97s Get:4 http://ftpmaster.internal/ubuntu plucky/main s390x libstdc++6 s390x 14.2.0-13ubuntu1 [896 kB] 98s Get:5 http://ftpmaster.internal/ubuntu plucky/main s390x libgcc-s1 s390x 14.2.0-13ubuntu1 [35.9 kB] 98s Get:6 http://ftpmaster.internal/ubuntu plucky/main s390x usb.ids all 2025.01.14-1 [223 kB] 98s Get:7 http://ftpmaster.internal/ubuntu plucky/main s390x python3-certifi all 2024.12.14+ds-1 [9800 B] 98s Get:8 http://ftpmaster.internal/ubuntu plucky/main s390x python3-chardet all 5.2.0+dfsg-2 [116 kB] 98s Get:9 http://ftpmaster.internal/ubuntu plucky/main s390x python3-jwt all 2.10.1-2 [21.0 kB] 98s Fetched 1408 kB in 1s (2322 kB/s) 98s (Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 55658 files and directories currently installed.) 98s Preparing to unpack .../rng-tools-debian_2.6_s390x.deb ... 98s Unpacking rng-tools-debian (2.6) over (2.5) ... 98s Preparing to unpack .../libatomic1_14.2.0-13ubuntu1_s390x.deb ... 98s Unpacking libatomic1:s390x (14.2.0-13ubuntu1) over (14.2.0-12ubuntu1) ... 98s Preparing to unpack .../gcc-14-base_14.2.0-13ubuntu1_s390x.deb ... 98s Unpacking gcc-14-base:s390x (14.2.0-13ubuntu1) over (14.2.0-12ubuntu1) ... 98s Setting up gcc-14-base:s390x (14.2.0-13ubuntu1) ... 98s (Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 55658 files and directories currently installed.) 98s Preparing to unpack .../libstdc++6_14.2.0-13ubuntu1_s390x.deb ... 98s Unpacking libstdc++6:s390x (14.2.0-13ubuntu1) over (14.2.0-12ubuntu1) ... 98s Setting up libstdc++6:s390x (14.2.0-13ubuntu1) ... 98s (Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 55658 files and directories currently installed.) 98s Preparing to unpack .../libgcc-s1_14.2.0-13ubuntu1_s390x.deb ... 
98s Unpacking libgcc-s1:s390x (14.2.0-13ubuntu1) over (14.2.0-12ubuntu1) ... 98s Setting up libgcc-s1:s390x (14.2.0-13ubuntu1) ... 98s (Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 55658 files and directories currently installed.) 98s Preparing to unpack .../usb.ids_2025.01.14-1_all.deb ... 98s Unpacking usb.ids (2025.01.14-1) over (2024.12.04-1) ... 98s Preparing to unpack .../python3-certifi_2024.12.14+ds-1_all.deb ... 98s Unpacking python3-certifi (2024.12.14+ds-1) over (2024.8.30+dfsg-1) ... 98s Preparing to unpack .../python3-chardet_5.2.0+dfsg-2_all.deb ... 98s Unpacking python3-chardet (5.2.0+dfsg-2) over (5.2.0+dfsg-1) ... 98s Preparing to unpack .../python3-jwt_2.10.1-2_all.deb ... 98s Unpacking python3-jwt (2.10.1-2) over (2.7.0-1) ... 98s Setting up python3-jwt (2.10.1-2) ... 98s Setting up python3-chardet (5.2.0+dfsg-2) ... 99s Setting up python3-certifi (2024.12.14+ds-1) ... 99s Setting up rng-tools-debian (2.6) ... 99s Setting up libatomic1:s390x (14.2.0-13ubuntu1) ... 99s Setting up usb.ids (2025.01.14-1) ... 99s Processing triggers for man-db (2.13.0-1) ... 100s Processing triggers for libc-bin (2.40-4ubuntu1) ... 101s Reading package lists... 101s Building dependency tree... 101s Reading state information... 101s 0 upgraded, 0 newly installed, 0 to remove and 1 not upgraded. 101s autopkgtest [02:31:13]: upgrading testbed (apt dist-upgrade and autopurge) 101s Reading package lists... 101s Building dependency tree... 101s Reading state information... 101s Calculating upgrade...Starting pkgProblemResolver with broken count: 0 101s Starting 2 pkgProblemResolver with broken count: 0 101s Done 101s Entering ResolveByKeep 102s 102s The following packages will be upgraded: 102s python3-urllib3 102s 1 upgraded, 0 newly installed, 0 to remove and 0 not upgraded. 102s Need to get 94.0 kB of archives. 102s After this operation, 18.4 kB of additional disk space will be used. 102s Get:1 http://ftpmaster.internal/ubuntu plucky-proposed/main s390x python3-urllib3 all 2.3.0-1 [94.0 kB] 102s Fetched 94.0 kB in 0s (296 kB/s) 102s (Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 55655 files and directories currently installed.) 102s Preparing to unpack .../python3-urllib3_2.3.0-1_all.deb ... 102s Unpacking python3-urllib3 (2.3.0-1) over (2.0.7-2ubuntu0.1) ... 102s Setting up python3-urllib3 (2.3.0-1) ... 103s Reading package lists... 103s Building dependency tree... 103s Reading state information... 
103s Starting pkgProblemResolver with broken count: 0 103s Starting 2 pkgProblemResolver with broken count: 0 103s Done 103s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded. 103s autopkgtest [02:31:15]: rebooting testbed after setup commands that affected boot 117s autopkgtest-virt-ssh: WARNING: ssh connection failed. Retrying in 3 seconds... 123s autopkgtest [02:31:35]: testbed running kernel: Linux 6.11.0-8-generic #8-Ubuntu SMP Mon Sep 16 12:49:35 UTC 2024 125s autopkgtest [02:31:37]: @@@@@@@@@@@@@@@@@@@@ apt-source vcr.py 128s Get:1 http://ftpmaster.internal/ubuntu plucky/universe vcr.py 6.0.2-2 (dsc) [2977 B] 128s Get:2 http://ftpmaster.internal/ubuntu plucky/universe vcr.py 6.0.2-2 (tar) [339 kB] 128s Get:3 http://ftpmaster.internal/ubuntu plucky/universe vcr.py 6.0.2-2 (diff) [6348 B] 128s gpgv: Signature made Tue Dec 17 14:55:48 2024 UTC 128s gpgv: using RSA key AC0A4FF12611B6FCCF01C111393587D97D86500B 128s gpgv: Can't check signature: No public key 128s dpkg-source: warning: cannot verify inline signature for ./vcr.py_6.0.2-2.dsc: no acceptable signature found 128s autopkgtest [02:31:40]: testing package vcr.py version 6.0.2-2 128s autopkgtest [02:31:40]: build not needed 129s autopkgtest [02:31:41]: test pybuild-autopkgtest: preparing testbed 129s Reading package lists... 129s Building dependency tree... 129s Reading state information... 129s Starting pkgProblemResolver with broken count: 0 129s Starting 2 pkgProblemResolver with broken count: 0 129s Done 129s The following NEW packages will be installed: 129s autoconf automake autopoint autotools-dev build-essential cpp cpp-14 129s cpp-14-s390x-linux-gnu cpp-s390x-linux-gnu debhelper debugedit dh-autoreconf 129s dh-python dh-strip-nondeterminism docutils-common dwz fonts-font-awesome 129s fonts-lato g++ g++-14 g++-14-s390x-linux-gnu g++-s390x-linux-gnu gcc gcc-14 129s gcc-14-s390x-linux-gnu gcc-s390x-linux-gnu gettext intltool-debian 129s libarchive-zip-perl libasan8 libcc1-0 libdebhelper-perl 129s libfile-stripnondeterminism-perl libgcc-14-dev libgomp1 libisl23 libitm1 129s libjs-jquery libjs-sphinxdoc libjs-underscore libjson-perl liblua5.4-0 129s libmpc3 libpython3.13-minimal libpython3.13-stdlib libstdc++-14-dev libtool 129s libubsan1 m4 pandoc pandoc-data po-debconf pybuild-plugin-autopkgtest 129s python-vcr-doc python3-aiohappyeyeballs python3-aiohttp python3-aiosignal 129s python3-alabaster python3-all python3-async-timeout python3-boto3 129s python3-botocore python3-brotli python3-brotlicffi python3-click 129s python3-dateutil python3-decorator python3-defusedxml python3-docutils 129s python3-flasgger python3-flask python3-frozenlist python3-greenlet 129s python3-httpbin python3-imagesize python3-iniconfig python3-itsdangerous 129s python3-jmespath python3-mistune python3-multidict python3-packaging 129s python3-pluggy python3-pytest python3-pytest-httpbin python3-pytest-tornado 129s python3-roman python3-s3transfer python3-six python3-snowballstemmer 129s python3-sphinx python3-sphinx-rtd-theme python3-sphinxcontrib.jquery 129s python3-tornado python3-vcr python3-werkzeug python3-wrapt python3-yarl 129s python3.13 python3.13-minimal sgml-base sphinx-common 129s sphinx-rtd-theme-common xml-core 130s 0 upgraded, 103 newly installed, 0 to remove and 0 not upgraded. 130s Need to get 130 MB of archives. 130s After this operation, 705 MB of additional disk space will be used. 
130s Get:1 http://ftpmaster.internal/ubuntu plucky/main s390x fonts-lato all 2.015-1 [2781 kB] 130s Get:2 http://ftpmaster.internal/ubuntu plucky/main s390x libpython3.13-minimal s390x 3.13.1-2 [880 kB] 130s Get:3 http://ftpmaster.internal/ubuntu plucky/main s390x python3.13-minimal s390x 3.13.1-2 [2364 kB] 130s Get:4 http://ftpmaster.internal/ubuntu plucky/main s390x sgml-base all 1.31 [11.4 kB] 130s Get:5 http://ftpmaster.internal/ubuntu plucky/main s390x m4 s390x 1.4.19-4build1 [256 kB] 130s Get:6 http://ftpmaster.internal/ubuntu plucky/main s390x autoconf all 2.72-3 [382 kB] 130s Get:7 http://ftpmaster.internal/ubuntu plucky/main s390x autotools-dev all 20220109.1 [44.9 kB] 130s Get:8 http://ftpmaster.internal/ubuntu plucky/main s390x automake all 1:1.16.5-1.3ubuntu1 [558 kB] 130s Get:9 http://ftpmaster.internal/ubuntu plucky/main s390x autopoint all 0.22.5-3 [616 kB] 130s Get:10 http://ftpmaster.internal/ubuntu plucky/main s390x libisl23 s390x 0.27-1 [704 kB] 130s Get:11 http://ftpmaster.internal/ubuntu plucky/main s390x libmpc3 s390x 1.3.1-1build2 [57.8 kB] 130s Get:12 http://ftpmaster.internal/ubuntu plucky/main s390x cpp-14-s390x-linux-gnu s390x 14.2.0-13ubuntu1 [9570 kB] 131s Get:13 http://ftpmaster.internal/ubuntu plucky/main s390x cpp-14 s390x 14.2.0-13ubuntu1 [1026 B] 131s Get:14 http://ftpmaster.internal/ubuntu plucky/main s390x cpp-s390x-linux-gnu s390x 4:14.1.0-2ubuntu1 [5452 B] 131s Get:15 http://ftpmaster.internal/ubuntu plucky/main s390x cpp s390x 4:14.1.0-2ubuntu1 [22.4 kB] 131s Get:16 http://ftpmaster.internal/ubuntu plucky/main s390x libcc1-0 s390x 14.2.0-13ubuntu1 [50.7 kB] 131s Get:17 http://ftpmaster.internal/ubuntu plucky/main s390x libgomp1 s390x 14.2.0-13ubuntu1 [151 kB] 131s Get:18 http://ftpmaster.internal/ubuntu plucky/main s390x libitm1 s390x 14.2.0-13ubuntu1 [30.9 kB] 131s Get:19 http://ftpmaster.internal/ubuntu plucky/main s390x libasan8 s390x 14.2.0-13ubuntu1 [2964 kB] 131s Get:20 http://ftpmaster.internal/ubuntu plucky/main s390x libubsan1 s390x 14.2.0-13ubuntu1 [1184 kB] 131s Get:21 http://ftpmaster.internal/ubuntu plucky/main s390x libgcc-14-dev s390x 14.2.0-13ubuntu1 [1037 kB] 131s Get:22 http://ftpmaster.internal/ubuntu plucky/main s390x gcc-14-s390x-linux-gnu s390x 14.2.0-13ubuntu1 [18.7 MB] 131s Get:23 http://ftpmaster.internal/ubuntu plucky/main s390x gcc-14 s390x 14.2.0-13ubuntu1 [523 kB] 131s Get:24 http://ftpmaster.internal/ubuntu plucky/main s390x gcc-s390x-linux-gnu s390x 4:14.1.0-2ubuntu1 [1204 B] 131s Get:25 http://ftpmaster.internal/ubuntu plucky/main s390x gcc s390x 4:14.1.0-2ubuntu1 [4996 B] 131s Get:26 http://ftpmaster.internal/ubuntu plucky/main s390x libstdc++-14-dev s390x 14.2.0-13ubuntu1 [2612 kB] 131s Get:27 http://ftpmaster.internal/ubuntu plucky/main s390x g++-14-s390x-linux-gnu s390x 14.2.0-13ubuntu1 [11.0 MB] 132s Get:28 http://ftpmaster.internal/ubuntu plucky/main s390x g++-14 s390x 14.2.0-13ubuntu1 [21.1 kB] 132s Get:29 http://ftpmaster.internal/ubuntu plucky/main s390x g++-s390x-linux-gnu s390x 4:14.1.0-2ubuntu1 [956 B] 132s Get:30 http://ftpmaster.internal/ubuntu plucky/main s390x g++ s390x 4:14.1.0-2ubuntu1 [1076 B] 132s Get:31 http://ftpmaster.internal/ubuntu plucky/main s390x build-essential s390x 12.10ubuntu1 [4930 B] 132s Get:32 http://ftpmaster.internal/ubuntu plucky/main s390x libdebhelper-perl all 13.20ubuntu1 [94.2 kB] 132s Get:33 http://ftpmaster.internal/ubuntu plucky/main s390x libtool all 2.4.7-8 [166 kB] 132s Get:34 http://ftpmaster.internal/ubuntu plucky/main s390x dh-autoreconf all 20 [16.1 kB] 132s Get:35 
http://ftpmaster.internal/ubuntu plucky/main s390x libarchive-zip-perl all 1.68-1 [90.2 kB] 132s Get:36 http://ftpmaster.internal/ubuntu plucky/main s390x libfile-stripnondeterminism-perl all 1.14.0-1 [20.1 kB] 132s Get:37 http://ftpmaster.internal/ubuntu plucky/main s390x dh-strip-nondeterminism all 1.14.0-1 [5058 B] 132s Get:38 http://ftpmaster.internal/ubuntu plucky/main s390x debugedit s390x 1:5.1-1 [49.9 kB] 132s Get:39 http://ftpmaster.internal/ubuntu plucky/main s390x dwz s390x 0.15-1build6 [122 kB] 132s Get:40 http://ftpmaster.internal/ubuntu plucky/main s390x gettext s390x 0.22.5-3 [997 kB] 132s Get:41 http://ftpmaster.internal/ubuntu plucky/main s390x intltool-debian all 0.35.0+20060710.6 [23.2 kB] 132s Get:42 http://ftpmaster.internal/ubuntu plucky/main s390x po-debconf all 1.0.21+nmu1 [233 kB] 132s Get:43 http://ftpmaster.internal/ubuntu plucky/main s390x debhelper all 13.20ubuntu1 [893 kB] 132s Get:44 http://ftpmaster.internal/ubuntu plucky/universe s390x dh-python all 6.20241217 [117 kB] 132s Get:45 http://ftpmaster.internal/ubuntu plucky/main s390x xml-core all 0.19 [20.3 kB] 132s Get:46 http://ftpmaster.internal/ubuntu plucky/main s390x docutils-common all 0.21.2+dfsg-2 [131 kB] 132s Get:47 http://ftpmaster.internal/ubuntu plucky/main s390x fonts-font-awesome all 5.0.10+really4.7.0~dfsg-4.1 [516 kB] 132s Get:48 http://ftpmaster.internal/ubuntu plucky/main s390x libjs-jquery all 3.6.1+dfsg+~3.5.14-1 [328 kB] 132s Get:49 http://ftpmaster.internal/ubuntu plucky/main s390x libjs-underscore all 1.13.4~dfsg+~1.11.4-3 [118 kB] 132s Get:50 http://ftpmaster.internal/ubuntu plucky/main s390x libjs-sphinxdoc all 8.1.3-3 [30.9 kB] 132s Get:51 http://ftpmaster.internal/ubuntu plucky/main s390x libjson-perl all 4.10000-1 [81.9 kB] 132s Get:52 http://ftpmaster.internal/ubuntu plucky/main s390x liblua5.4-0 s390x 5.4.7-1 [174 kB] 132s Get:53 http://ftpmaster.internal/ubuntu plucky/main s390x libpython3.13-stdlib s390x 3.13.1-2 [2074 kB] 132s Get:54 http://ftpmaster.internal/ubuntu plucky/universe s390x pandoc-data all 3.1.11.1-3build1 [78.8 kB] 132s Get:55 http://ftpmaster.internal/ubuntu plucky/universe s390x pandoc s390x 3.1.11.1+ds-2 [52.5 MB] 135s Get:56 http://ftpmaster.internal/ubuntu plucky/universe s390x pybuild-plugin-autopkgtest all 6.20241217 [1746 B] 135s Get:57 http://ftpmaster.internal/ubuntu plucky/universe s390x python-vcr-doc all 6.0.2-2 [184 kB] 135s Get:58 http://ftpmaster.internal/ubuntu plucky/universe s390x python3-aiohappyeyeballs all 2.4.4-2 [10.6 kB] 135s Get:59 http://ftpmaster.internal/ubuntu plucky/universe s390x python3-multidict s390x 6.1.0-1build1 [38.4 kB] 135s Get:60 http://ftpmaster.internal/ubuntu plucky/universe s390x python3-yarl s390x 1.13.1-1build1 [122 kB] 135s Get:61 http://ftpmaster.internal/ubuntu plucky/universe s390x python3-async-timeout all 5.0.1-1 [6830 B] 135s Get:62 http://ftpmaster.internal/ubuntu plucky/universe s390x python3-frozenlist s390x 1.5.0-1build1 [64.2 kB] 135s Get:63 http://ftpmaster.internal/ubuntu plucky/universe s390x python3-aiosignal all 1.3.2-1 [5182 B] 135s Get:64 http://ftpmaster.internal/ubuntu plucky/universe s390x python3-aiohttp s390x 3.10.11-1 [363 kB] 135s Get:65 http://ftpmaster.internal/ubuntu plucky/main s390x python3.13 s390x 3.13.1-2 [729 kB] 135s Get:66 http://ftpmaster.internal/ubuntu plucky/main s390x python3-all s390x 3.12.8-1 [894 B] 135s Get:67 http://ftpmaster.internal/ubuntu plucky/main s390x python3-dateutil all 2.9.0-3 [80.2 kB] 135s Get:68 http://ftpmaster.internal/ubuntu plucky/main s390x 
python3-jmespath all 1.0.1-1 [21.3 kB] 135s Get:69 http://ftpmaster.internal/ubuntu plucky/main s390x python3-six all 1.17.0-1 [13.2 kB] 135s Get:70 http://ftpmaster.internal/ubuntu plucky/main s390x python3-botocore all 1.34.46+repack-1ubuntu1 [6211 kB] 135s Get:71 http://ftpmaster.internal/ubuntu plucky/main s390x python3-s3transfer all 0.10.1-1ubuntu2 [54.3 kB] 135s Get:72 http://ftpmaster.internal/ubuntu plucky/main s390x python3-boto3 all 1.34.46+dfsg-1ubuntu1 [72.5 kB] 135s Get:73 http://ftpmaster.internal/ubuntu plucky/universe s390x python3-brotli s390x 1.1.0-2build3 [381 kB] 135s Get:74 http://ftpmaster.internal/ubuntu plucky/universe s390x python3-brotlicffi s390x 1.1.0.0+ds1-1 [18.8 kB] 135s Get:75 http://ftpmaster.internal/ubuntu plucky/main s390x python3-click all 8.1.8-1 [79.8 kB] 135s Get:76 http://ftpmaster.internal/ubuntu plucky/main s390x python3-decorator all 5.1.1-5 [10.1 kB] 135s Get:77 http://ftpmaster.internal/ubuntu plucky/main s390x python3-defusedxml all 0.7.1-3 [42.2 kB] 135s Get:78 http://ftpmaster.internal/ubuntu plucky/main s390x python3-roman all 4.2-1 [10.0 kB] 135s Get:79 http://ftpmaster.internal/ubuntu plucky/main s390x python3-docutils all 0.21.2+dfsg-2 [409 kB] 135s Get:80 http://ftpmaster.internal/ubuntu plucky/main s390x python3-itsdangerous all 2.2.0-1 [15.2 kB] 135s Get:81 http://ftpmaster.internal/ubuntu plucky/main s390x python3-werkzeug all 3.1.3-2 [169 kB] 135s Get:82 http://ftpmaster.internal/ubuntu plucky/main s390x python3-flask all 3.1.0-2ubuntu1 [84.4 kB] 135s Get:83 http://ftpmaster.internal/ubuntu plucky/universe s390x python3-mistune all 3.0.2-2 [32.9 kB] 135s Get:84 http://ftpmaster.internal/ubuntu plucky/main s390x python3-packaging all 24.2-1 [51.5 kB] 135s Get:85 http://ftpmaster.internal/ubuntu plucky/universe s390x python3-flasgger all 0.9.7.2~dev2+dfsg-3 [1693 kB] 135s Get:86 http://ftpmaster.internal/ubuntu plucky/main s390x python3-greenlet s390x 3.1.0-1 [176 kB] 135s Get:87 http://ftpmaster.internal/ubuntu plucky/universe s390x python3-httpbin all 0.10.2+dfsg-2 [89.0 kB] 135s Get:88 http://ftpmaster.internal/ubuntu plucky/main s390x python3-imagesize all 1.4.1-1 [6844 B] 135s Get:89 http://ftpmaster.internal/ubuntu plucky/universe s390x python3-iniconfig all 1.1.1-2 [6024 B] 135s Get:90 http://ftpmaster.internal/ubuntu plucky/universe s390x python3-pluggy all 1.5.0-1 [21.0 kB] 135s Get:91 http://ftpmaster.internal/ubuntu plucky/universe s390x python3-pytest all 8.3.4-1 [252 kB] 135s Get:92 http://ftpmaster.internal/ubuntu plucky/universe s390x python3-pytest-httpbin all 2.1.0-1 [13.0 kB] 135s Get:93 http://ftpmaster.internal/ubuntu plucky/main s390x python3-tornado s390x 6.4.1-3 [298 kB] 135s Get:94 http://ftpmaster.internal/ubuntu plucky/universe s390x python3-pytest-tornado all 0.8.1-3 [7180 B] 135s Get:95 http://ftpmaster.internal/ubuntu plucky/main s390x python3-snowballstemmer all 2.2.0-4build1 [59.8 kB] 135s Get:96 http://ftpmaster.internal/ubuntu plucky/main s390x sphinx-common all 8.1.3-3 [661 kB] 135s Get:97 http://ftpmaster.internal/ubuntu plucky/main s390x python3-alabaster all 0.7.16-0.1 [18.5 kB] 135s Get:98 http://ftpmaster.internal/ubuntu plucky/main s390x python3-sphinx all 8.1.3-3 [474 kB] 136s Get:99 http://ftpmaster.internal/ubuntu plucky/main s390x sphinx-rtd-theme-common all 3.0.2+dfsg-1 [1014 kB] 136s Get:100 http://ftpmaster.internal/ubuntu plucky/universe s390x python3-sphinxcontrib.jquery all 4.1-5 [6678 B] 136s Get:101 http://ftpmaster.internal/ubuntu plucky/universe s390x python3-sphinx-rtd-theme all 
3.0.2+dfsg-1 [23.5 kB] 136s Get:102 http://ftpmaster.internal/ubuntu plucky/main s390x python3-wrapt s390x 1.15.0-4 [34.4 kB] 136s Get:103 http://ftpmaster.internal/ubuntu plucky/universe s390x python3-vcr all 6.0.2-2 [33.0 kB] 136s Fetched 130 MB in 6s (20.0 MB/s) 136s Selecting previously unselected package fonts-lato. 136s (Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 55661 files and directories currently installed.) 136s Preparing to unpack .../000-fonts-lato_2.015-1_all.deb ... 136s Unpacking fonts-lato (2.015-1) ... 137s Selecting previously unselected package libpython3.13-minimal:s390x. 137s Preparing to unpack .../001-libpython3.13-minimal_3.13.1-2_s390x.deb ... 137s Unpacking libpython3.13-minimal:s390x (3.13.1-2) ... 137s Selecting previously unselected package python3.13-minimal. 137s Preparing to unpack .../002-python3.13-minimal_3.13.1-2_s390x.deb ... 137s Unpacking python3.13-minimal (3.13.1-2) ... 137s Selecting previously unselected package sgml-base. 137s Preparing to unpack .../003-sgml-base_1.31_all.deb ... 137s Unpacking sgml-base (1.31) ... 137s Selecting previously unselected package m4. 137s Preparing to unpack .../004-m4_1.4.19-4build1_s390x.deb ... 137s Unpacking m4 (1.4.19-4build1) ... 137s Selecting previously unselected package autoconf. 137s Preparing to unpack .../005-autoconf_2.72-3_all.deb ... 137s Unpacking autoconf (2.72-3) ... 137s Selecting previously unselected package autotools-dev. 137s Preparing to unpack .../006-autotools-dev_20220109.1_all.deb ... 137s Unpacking autotools-dev (20220109.1) ... 137s Selecting previously unselected package automake. 137s Preparing to unpack .../007-automake_1%3a1.16.5-1.3ubuntu1_all.deb ... 137s Unpacking automake (1:1.16.5-1.3ubuntu1) ... 137s Selecting previously unselected package autopoint. 137s Preparing to unpack .../008-autopoint_0.22.5-3_all.deb ... 137s Unpacking autopoint (0.22.5-3) ... 137s Selecting previously unselected package libisl23:s390x. 137s Preparing to unpack .../009-libisl23_0.27-1_s390x.deb ... 137s Unpacking libisl23:s390x (0.27-1) ... 137s Selecting previously unselected package libmpc3:s390x. 137s Preparing to unpack .../010-libmpc3_1.3.1-1build2_s390x.deb ... 137s Unpacking libmpc3:s390x (1.3.1-1build2) ... 137s Selecting previously unselected package cpp-14-s390x-linux-gnu. 137s Preparing to unpack .../011-cpp-14-s390x-linux-gnu_14.2.0-13ubuntu1_s390x.deb ... 137s Unpacking cpp-14-s390x-linux-gnu (14.2.0-13ubuntu1) ... 137s Selecting previously unselected package cpp-14. 137s Preparing to unpack .../012-cpp-14_14.2.0-13ubuntu1_s390x.deb ... 137s Unpacking cpp-14 (14.2.0-13ubuntu1) ... 137s Selecting previously unselected package cpp-s390x-linux-gnu. 137s Preparing to unpack .../013-cpp-s390x-linux-gnu_4%3a14.1.0-2ubuntu1_s390x.deb ... 137s Unpacking cpp-s390x-linux-gnu (4:14.1.0-2ubuntu1) ... 137s Selecting previously unselected package cpp. 137s Preparing to unpack .../014-cpp_4%3a14.1.0-2ubuntu1_s390x.deb ... 137s Unpacking cpp (4:14.1.0-2ubuntu1) ... 
137s Selecting previously unselected package libcc1-0:s390x. 137s Preparing to unpack .../015-libcc1-0_14.2.0-13ubuntu1_s390x.deb ... 137s Unpacking libcc1-0:s390x (14.2.0-13ubuntu1) ... 137s Selecting previously unselected package libgomp1:s390x. 137s Preparing to unpack .../016-libgomp1_14.2.0-13ubuntu1_s390x.deb ... 137s Unpacking libgomp1:s390x (14.2.0-13ubuntu1) ... 137s Selecting previously unselected package libitm1:s390x. 137s Preparing to unpack .../017-libitm1_14.2.0-13ubuntu1_s390x.deb ... 137s Unpacking libitm1:s390x (14.2.0-13ubuntu1) ... 137s Selecting previously unselected package libasan8:s390x. 137s Preparing to unpack .../018-libasan8_14.2.0-13ubuntu1_s390x.deb ... 137s Unpacking libasan8:s390x (14.2.0-13ubuntu1) ... 137s Selecting previously unselected package libubsan1:s390x. 137s Preparing to unpack .../019-libubsan1_14.2.0-13ubuntu1_s390x.deb ... 137s Unpacking libubsan1:s390x (14.2.0-13ubuntu1) ... 137s Selecting previously unselected package libgcc-14-dev:s390x. 137s Preparing to unpack .../020-libgcc-14-dev_14.2.0-13ubuntu1_s390x.deb ... 137s Unpacking libgcc-14-dev:s390x (14.2.0-13ubuntu1) ... 137s Selecting previously unselected package gcc-14-s390x-linux-gnu. 137s Preparing to unpack .../021-gcc-14-s390x-linux-gnu_14.2.0-13ubuntu1_s390x.deb ... 137s Unpacking gcc-14-s390x-linux-gnu (14.2.0-13ubuntu1) ... 137s Selecting previously unselected package gcc-14. 137s Preparing to unpack .../022-gcc-14_14.2.0-13ubuntu1_s390x.deb ... 137s Unpacking gcc-14 (14.2.0-13ubuntu1) ... 137s Selecting previously unselected package gcc-s390x-linux-gnu. 137s Preparing to unpack .../023-gcc-s390x-linux-gnu_4%3a14.1.0-2ubuntu1_s390x.deb ... 137s Unpacking gcc-s390x-linux-gnu (4:14.1.0-2ubuntu1) ... 137s Selecting previously unselected package gcc. 137s Preparing to unpack .../024-gcc_4%3a14.1.0-2ubuntu1_s390x.deb ... 137s Unpacking gcc (4:14.1.0-2ubuntu1) ... 137s Selecting previously unselected package libstdc++-14-dev:s390x. 137s Preparing to unpack .../025-libstdc++-14-dev_14.2.0-13ubuntu1_s390x.deb ... 137s Unpacking libstdc++-14-dev:s390x (14.2.0-13ubuntu1) ... 137s Selecting previously unselected package g++-14-s390x-linux-gnu. 137s Preparing to unpack .../026-g++-14-s390x-linux-gnu_14.2.0-13ubuntu1_s390x.deb ... 137s Unpacking g++-14-s390x-linux-gnu (14.2.0-13ubuntu1) ... 138s Selecting previously unselected package g++-14. 138s Preparing to unpack .../027-g++-14_14.2.0-13ubuntu1_s390x.deb ... 138s Unpacking g++-14 (14.2.0-13ubuntu1) ... 138s Selecting previously unselected package g++-s390x-linux-gnu. 138s Preparing to unpack .../028-g++-s390x-linux-gnu_4%3a14.1.0-2ubuntu1_s390x.deb ... 138s Unpacking g++-s390x-linux-gnu (4:14.1.0-2ubuntu1) ... 138s Selecting previously unselected package g++. 138s Preparing to unpack .../029-g++_4%3a14.1.0-2ubuntu1_s390x.deb ... 138s Unpacking g++ (4:14.1.0-2ubuntu1) ... 138s Selecting previously unselected package build-essential. 138s Preparing to unpack .../030-build-essential_12.10ubuntu1_s390x.deb ... 138s Unpacking build-essential (12.10ubuntu1) ... 138s Selecting previously unselected package libdebhelper-perl. 138s Preparing to unpack .../031-libdebhelper-perl_13.20ubuntu1_all.deb ... 138s Unpacking libdebhelper-perl (13.20ubuntu1) ... 138s Selecting previously unselected package libtool. 138s Preparing to unpack .../032-libtool_2.4.7-8_all.deb ... 138s Unpacking libtool (2.4.7-8) ... 138s Selecting previously unselected package dh-autoreconf. 138s Preparing to unpack .../033-dh-autoreconf_20_all.deb ... 
138s Unpacking dh-autoreconf (20) ... 138s Selecting previously unselected package libarchive-zip-perl. 138s Preparing to unpack .../034-libarchive-zip-perl_1.68-1_all.deb ... 138s Unpacking libarchive-zip-perl (1.68-1) ... 138s Selecting previously unselected package libfile-stripnondeterminism-perl. 138s Preparing to unpack .../035-libfile-stripnondeterminism-perl_1.14.0-1_all.deb ... 138s Unpacking libfile-stripnondeterminism-perl (1.14.0-1) ... 138s Selecting previously unselected package dh-strip-nondeterminism. 138s Preparing to unpack .../036-dh-strip-nondeterminism_1.14.0-1_all.deb ... 138s Unpacking dh-strip-nondeterminism (1.14.0-1) ... 138s Selecting previously unselected package debugedit. 138s Preparing to unpack .../037-debugedit_1%3a5.1-1_s390x.deb ... 138s Unpacking debugedit (1:5.1-1) ... 138s Selecting previously unselected package dwz. 138s Preparing to unpack .../038-dwz_0.15-1build6_s390x.deb ... 138s Unpacking dwz (0.15-1build6) ... 138s Selecting previously unselected package gettext. 138s Preparing to unpack .../039-gettext_0.22.5-3_s390x.deb ... 138s Unpacking gettext (0.22.5-3) ... 138s Selecting previously unselected package intltool-debian. 138s Preparing to unpack .../040-intltool-debian_0.35.0+20060710.6_all.deb ... 138s Unpacking intltool-debian (0.35.0+20060710.6) ... 138s Selecting previously unselected package po-debconf. 138s Preparing to unpack .../041-po-debconf_1.0.21+nmu1_all.deb ... 138s Unpacking po-debconf (1.0.21+nmu1) ... 138s Selecting previously unselected package debhelper. 138s Preparing to unpack .../042-debhelper_13.20ubuntu1_all.deb ... 138s Unpacking debhelper (13.20ubuntu1) ... 138s Selecting previously unselected package dh-python. 138s Preparing to unpack .../043-dh-python_6.20241217_all.deb ... 138s Unpacking dh-python (6.20241217) ... 138s Selecting previously unselected package xml-core. 138s Preparing to unpack .../044-xml-core_0.19_all.deb ... 138s Unpacking xml-core (0.19) ... 138s Selecting previously unselected package docutils-common. 138s Preparing to unpack .../045-docutils-common_0.21.2+dfsg-2_all.deb ... 138s Unpacking docutils-common (0.21.2+dfsg-2) ... 138s Selecting previously unselected package fonts-font-awesome. 138s Preparing to unpack .../046-fonts-font-awesome_5.0.10+really4.7.0~dfsg-4.1_all.deb ... 138s Unpacking fonts-font-awesome (5.0.10+really4.7.0~dfsg-4.1) ... 138s Selecting previously unselected package libjs-jquery. 138s Preparing to unpack .../047-libjs-jquery_3.6.1+dfsg+~3.5.14-1_all.deb ... 138s Unpacking libjs-jquery (3.6.1+dfsg+~3.5.14-1) ... 138s Selecting previously unselected package libjs-underscore. 138s Preparing to unpack .../048-libjs-underscore_1.13.4~dfsg+~1.11.4-3_all.deb ... 138s Unpacking libjs-underscore (1.13.4~dfsg+~1.11.4-3) ... 138s Selecting previously unselected package libjs-sphinxdoc. 138s Preparing to unpack .../049-libjs-sphinxdoc_8.1.3-3_all.deb ... 138s Unpacking libjs-sphinxdoc (8.1.3-3) ... 138s Selecting previously unselected package libjson-perl. 138s Preparing to unpack .../050-libjson-perl_4.10000-1_all.deb ... 138s Unpacking libjson-perl (4.10000-1) ... 138s Selecting previously unselected package liblua5.4-0:s390x. 138s Preparing to unpack .../051-liblua5.4-0_5.4.7-1_s390x.deb ... 138s Unpacking liblua5.4-0:s390x (5.4.7-1) ... 138s Selecting previously unselected package libpython3.13-stdlib:s390x. 138s Preparing to unpack .../052-libpython3.13-stdlib_3.13.1-2_s390x.deb ... 138s Unpacking libpython3.13-stdlib:s390x (3.13.1-2) ... 
138s Selecting previously unselected package pandoc-data. 138s Preparing to unpack .../053-pandoc-data_3.1.11.1-3build1_all.deb ... 138s Unpacking pandoc-data (3.1.11.1-3build1) ... 138s Selecting previously unselected package pandoc. 138s Preparing to unpack .../054-pandoc_3.1.11.1+ds-2_s390x.deb ... 138s Unpacking pandoc (3.1.11.1+ds-2) ... 139s Selecting previously unselected package pybuild-plugin-autopkgtest. 139s Preparing to unpack .../055-pybuild-plugin-autopkgtest_6.20241217_all.deb ... 139s Unpacking pybuild-plugin-autopkgtest (6.20241217) ... 139s Selecting previously unselected package python-vcr-doc. 139s Preparing to unpack .../056-python-vcr-doc_6.0.2-2_all.deb ... 139s Unpacking python-vcr-doc (6.0.2-2) ... 139s Selecting previously unselected package python3-aiohappyeyeballs. 139s Preparing to unpack .../057-python3-aiohappyeyeballs_2.4.4-2_all.deb ... 139s Unpacking python3-aiohappyeyeballs (2.4.4-2) ... 139s Selecting previously unselected package python3-multidict. 139s Preparing to unpack .../058-python3-multidict_6.1.0-1build1_s390x.deb ... 139s Unpacking python3-multidict (6.1.0-1build1) ... 139s Selecting previously unselected package python3-yarl. 139s Preparing to unpack .../059-python3-yarl_1.13.1-1build1_s390x.deb ... 139s Unpacking python3-yarl (1.13.1-1build1) ... 139s Selecting previously unselected package python3-async-timeout. 139s Preparing to unpack .../060-python3-async-timeout_5.0.1-1_all.deb ... 139s Unpacking python3-async-timeout (5.0.1-1) ... 139s Selecting previously unselected package python3-frozenlist. 139s Preparing to unpack .../061-python3-frozenlist_1.5.0-1build1_s390x.deb ... 139s Unpacking python3-frozenlist (1.5.0-1build1) ... 139s Selecting previously unselected package python3-aiosignal. 139s Preparing to unpack .../062-python3-aiosignal_1.3.2-1_all.deb ... 139s Unpacking python3-aiosignal (1.3.2-1) ... 139s Selecting previously unselected package python3-aiohttp. 139s Preparing to unpack .../063-python3-aiohttp_3.10.11-1_s390x.deb ... 139s Unpacking python3-aiohttp (3.10.11-1) ... 139s Selecting previously unselected package python3.13. 139s Preparing to unpack .../064-python3.13_3.13.1-2_s390x.deb ... 139s Unpacking python3.13 (3.13.1-2) ... 139s Selecting previously unselected package python3-all. 139s Preparing to unpack .../065-python3-all_3.12.8-1_s390x.deb ... 139s Unpacking python3-all (3.12.8-1) ... 139s Selecting previously unselected package python3-dateutil. 139s Preparing to unpack .../066-python3-dateutil_2.9.0-3_all.deb ... 139s Unpacking python3-dateutil (2.9.0-3) ... 139s Selecting previously unselected package python3-jmespath. 139s Preparing to unpack .../067-python3-jmespath_1.0.1-1_all.deb ... 139s Unpacking python3-jmespath (1.0.1-1) ... 139s Selecting previously unselected package python3-six. 139s Preparing to unpack .../068-python3-six_1.17.0-1_all.deb ... 139s Unpacking python3-six (1.17.0-1) ... 139s Selecting previously unselected package python3-botocore. 139s Preparing to unpack .../069-python3-botocore_1.34.46+repack-1ubuntu1_all.deb ... 139s Unpacking python3-botocore (1.34.46+repack-1ubuntu1) ... 140s Selecting previously unselected package python3-s3transfer. 140s Preparing to unpack .../070-python3-s3transfer_0.10.1-1ubuntu2_all.deb ... 140s Unpacking python3-s3transfer (0.10.1-1ubuntu2) ... 140s Selecting previously unselected package python3-boto3. 140s Preparing to unpack .../071-python3-boto3_1.34.46+dfsg-1ubuntu1_all.deb ... 140s Unpacking python3-boto3 (1.34.46+dfsg-1ubuntu1) ... 
140s Selecting previously unselected package python3-brotli. 140s Preparing to unpack .../072-python3-brotli_1.1.0-2build3_s390x.deb ... 140s Unpacking python3-brotli (1.1.0-2build3) ... 140s Selecting previously unselected package python3-brotlicffi. 140s Preparing to unpack .../073-python3-brotlicffi_1.1.0.0+ds1-1_s390x.deb ... 140s Unpacking python3-brotlicffi (1.1.0.0+ds1-1) ... 140s Selecting previously unselected package python3-click. 140s Preparing to unpack .../074-python3-click_8.1.8-1_all.deb ... 140s Unpacking python3-click (8.1.8-1) ... 140s Selecting previously unselected package python3-decorator. 140s Preparing to unpack .../075-python3-decorator_5.1.1-5_all.deb ... 140s Unpacking python3-decorator (5.1.1-5) ... 140s Selecting previously unselected package python3-defusedxml. 140s Preparing to unpack .../076-python3-defusedxml_0.7.1-3_all.deb ... 140s Unpacking python3-defusedxml (0.7.1-3) ... 140s Selecting previously unselected package python3-roman. 140s Preparing to unpack .../077-python3-roman_4.2-1_all.deb ... 140s Unpacking python3-roman (4.2-1) ... 140s Selecting previously unselected package python3-docutils. 140s Preparing to unpack .../078-python3-docutils_0.21.2+dfsg-2_all.deb ... 140s Unpacking python3-docutils (0.21.2+dfsg-2) ... 140s Selecting previously unselected package python3-itsdangerous. 140s Preparing to unpack .../079-python3-itsdangerous_2.2.0-1_all.deb ... 140s Unpacking python3-itsdangerous (2.2.0-1) ... 140s Selecting previously unselected package python3-werkzeug. 140s Preparing to unpack .../080-python3-werkzeug_3.1.3-2_all.deb ... 140s Unpacking python3-werkzeug (3.1.3-2) ... 140s Selecting previously unselected package python3-flask. 140s Preparing to unpack .../081-python3-flask_3.1.0-2ubuntu1_all.deb ... 140s Unpacking python3-flask (3.1.0-2ubuntu1) ... 140s Selecting previously unselected package python3-mistune. 140s Preparing to unpack .../082-python3-mistune_3.0.2-2_all.deb ... 140s Unpacking python3-mistune (3.0.2-2) ... 140s Selecting previously unselected package python3-packaging. 140s Preparing to unpack .../083-python3-packaging_24.2-1_all.deb ... 140s Unpacking python3-packaging (24.2-1) ... 140s Selecting previously unselected package python3-flasgger. 140s Preparing to unpack .../084-python3-flasgger_0.9.7.2~dev2+dfsg-3_all.deb ... 140s Unpacking python3-flasgger (0.9.7.2~dev2+dfsg-3) ... 140s Selecting previously unselected package python3-greenlet. 140s Preparing to unpack .../085-python3-greenlet_3.1.0-1_s390x.deb ... 140s Unpacking python3-greenlet (3.1.0-1) ... 140s Selecting previously unselected package python3-httpbin. 140s Preparing to unpack .../086-python3-httpbin_0.10.2+dfsg-2_all.deb ... 140s Unpacking python3-httpbin (0.10.2+dfsg-2) ... 140s Selecting previously unselected package python3-imagesize. 140s Preparing to unpack .../087-python3-imagesize_1.4.1-1_all.deb ... 140s Unpacking python3-imagesize (1.4.1-1) ... 140s Selecting previously unselected package python3-iniconfig. 140s Preparing to unpack .../088-python3-iniconfig_1.1.1-2_all.deb ... 140s Unpacking python3-iniconfig (1.1.1-2) ... 140s Selecting previously unselected package python3-pluggy. 140s Preparing to unpack .../089-python3-pluggy_1.5.0-1_all.deb ... 140s Unpacking python3-pluggy (1.5.0-1) ... 140s Selecting previously unselected package python3-pytest. 140s Preparing to unpack .../090-python3-pytest_8.3.4-1_all.deb ... 140s Unpacking python3-pytest (8.3.4-1) ... 140s Selecting previously unselected package python3-pytest-httpbin. 
140s Preparing to unpack .../091-python3-pytest-httpbin_2.1.0-1_all.deb ... 140s Unpacking python3-pytest-httpbin (2.1.0-1) ... 140s Selecting previously unselected package python3-tornado. 140s Preparing to unpack .../092-python3-tornado_6.4.1-3_s390x.deb ... 140s Unpacking python3-tornado (6.4.1-3) ... 140s Selecting previously unselected package python3-pytest-tornado. 140s Preparing to unpack .../093-python3-pytest-tornado_0.8.1-3_all.deb ... 140s Unpacking python3-pytest-tornado (0.8.1-3) ... 140s Selecting previously unselected package python3-snowballstemmer. 140s Preparing to unpack .../094-python3-snowballstemmer_2.2.0-4build1_all.deb ... 140s Unpacking python3-snowballstemmer (2.2.0-4build1) ... 140s Selecting previously unselected package sphinx-common. 140s Preparing to unpack .../095-sphinx-common_8.1.3-3_all.deb ... 140s Unpacking sphinx-common (8.1.3-3) ... 140s Selecting previously unselected package python3-alabaster. 140s Preparing to unpack .../096-python3-alabaster_0.7.16-0.1_all.deb ... 140s Unpacking python3-alabaster (0.7.16-0.1) ... 140s Selecting previously unselected package python3-sphinx. 140s Preparing to unpack .../097-python3-sphinx_8.1.3-3_all.deb ... 140s Unpacking python3-sphinx (8.1.3-3) ... 140s Selecting previously unselected package sphinx-rtd-theme-common. 140s Preparing to unpack .../098-sphinx-rtd-theme-common_3.0.2+dfsg-1_all.deb ... 140s Unpacking sphinx-rtd-theme-common (3.0.2+dfsg-1) ... 140s Selecting previously unselected package python3-sphinxcontrib.jquery. 140s Preparing to unpack .../099-python3-sphinxcontrib.jquery_4.1-5_all.deb ... 140s Unpacking python3-sphinxcontrib.jquery (4.1-5) ... 140s Selecting previously unselected package python3-sphinx-rtd-theme. 140s Preparing to unpack .../100-python3-sphinx-rtd-theme_3.0.2+dfsg-1_all.deb ... 140s Unpacking python3-sphinx-rtd-theme (3.0.2+dfsg-1) ... 140s Selecting previously unselected package python3-wrapt. 140s Preparing to unpack .../101-python3-wrapt_1.15.0-4_s390x.deb ... 140s Unpacking python3-wrapt (1.15.0-4) ... 140s Selecting previously unselected package python3-vcr. 140s Preparing to unpack .../102-python3-vcr_6.0.2-2_all.deb ... 140s Unpacking python3-vcr (6.0.2-2) ... 140s Setting up dh-python (6.20241217) ... 140s Setting up python3-iniconfig (1.1.1-2) ... 141s Setting up python3-tornado (6.4.1-3) ... 141s Setting up python3-brotlicffi (1.1.0.0+ds1-1) ... 141s Setting up fonts-lato (2.015-1) ... 141s Setting up python3-defusedxml (0.7.1-3) ... 141s Setting up libarchive-zip-perl (1.68-1) ... 141s Setting up python3-alabaster (0.7.16-0.1) ... 141s Setting up libdebhelper-perl (13.20ubuntu1) ... 141s Setting up m4 (1.4.19-4build1) ... 141s Setting up python3-itsdangerous (2.2.0-1) ... 142s Setting up libgomp1:s390x (14.2.0-13ubuntu1) ... 142s Setting up python3-click (8.1.8-1) ... 142s Setting up python3-multidict (6.1.0-1build1) ... 142s Setting up python3-frozenlist (1.5.0-1build1) ... 142s Setting up python3-aiosignal (1.3.2-1) ... 142s Setting up python3-async-timeout (5.0.1-1) ... 142s Setting up python3-six (1.17.0-1) ... 142s Setting up libpython3.13-minimal:s390x (3.13.1-2) ... 142s Setting up python3-roman (4.2-1) ... 143s Setting up python3-decorator (5.1.1-5) ... 143s Setting up autotools-dev (20220109.1) ... 143s Setting up python3-packaging (24.2-1) ... 143s Setting up python3-snowballstemmer (2.2.0-4build1) ... 143s Setting up python3-werkzeug (3.1.3-2) ... 144s Setting up python3-jmespath (1.0.1-1) ... 144s Setting up python3-brotli (1.1.0-2build3) ... 
144s Setting up python3-greenlet (3.1.0-1) ... 144s Setting up libmpc3:s390x (1.3.1-1build2) ... 144s Setting up python3-wrapt (1.15.0-4) ... 144s Setting up autopoint (0.22.5-3) ... 144s Setting up python3-aiohappyeyeballs (2.4.4-2) ... 144s Setting up autoconf (2.72-3) ... 144s Setting up python3-pluggy (1.5.0-1) ... 145s Setting up libubsan1:s390x (14.2.0-13ubuntu1) ... 145s Setting up dwz (0.15-1build6) ... 145s Setting up libasan8:s390x (14.2.0-13ubuntu1) ... 145s Setting up libjson-perl (4.10000-1) ... 145s Setting up debugedit (1:5.1-1) ... 145s Setting up liblua5.4-0:s390x (5.4.7-1) ... 145s Setting up python3.13-minimal (3.13.1-2) ... 145s Setting up python3-dateutil (2.9.0-3) ... 145s Setting up sgml-base (1.31) ... 145s Setting up pandoc-data (3.1.11.1-3build1) ... 145s Setting up libjs-jquery (3.6.1+dfsg+~3.5.14-1) ... 145s Setting up libisl23:s390x (0.27-1) ... 145s Setting up python3-yarl (1.13.1-1build1) ... 145s Setting up python3-mistune (3.0.2-2) ... 146s Setting up libpython3.13-stdlib:s390x (3.13.1-2) ... 146s Setting up fonts-font-awesome (5.0.10+really4.7.0~dfsg-4.1) ... 146s Setting up sphinx-rtd-theme-common (3.0.2+dfsg-1) ... 146s Setting up libcc1-0:s390x (14.2.0-13ubuntu1) ... 146s Setting up libitm1:s390x (14.2.0-13ubuntu1) ... 146s Setting up libjs-underscore (1.13.4~dfsg+~1.11.4-3) ... 146s Setting up python3-imagesize (1.4.1-1) ... 146s Setting up automake (1:1.16.5-1.3ubuntu1) ... 146s update-alternatives: using /usr/bin/automake-1.16 to provide /usr/bin/automake (automake) in auto mode 146s Setting up libfile-stripnondeterminism-perl (1.14.0-1) ... 146s Setting up gettext (0.22.5-3) ... 146s Setting up python3.13 (3.13.1-2) ... 146s Setting up python3-pytest (8.3.4-1) ... 147s Setting up python3-flask (3.1.0-2ubuntu1) ... 147s Setting up python3-aiohttp (3.10.11-1) ... 148s Setting up python3-all (3.12.8-1) ... 148s Setting up intltool-debian (0.35.0+20060710.6) ... 148s Setting up pandoc (3.1.11.1+ds-2) ... 148s Setting up python3-pytest-tornado (0.8.1-3) ... 148s Setting up python3-botocore (1.34.46+repack-1ubuntu1) ... 148s Setting up python3-vcr (6.0.2-2) ... 148s Setting up libjs-sphinxdoc (8.1.3-3) ... 148s Setting up cpp-14-s390x-linux-gnu (14.2.0-13ubuntu1) ... 148s Setting up cpp-14 (14.2.0-13ubuntu1) ... 148s Setting up dh-strip-nondeterminism (1.14.0-1) ... 148s Setting up xml-core (0.19) ... 148s Setting up libgcc-14-dev:s390x (14.2.0-13ubuntu1) ... 148s Setting up libstdc++-14-dev:s390x (14.2.0-13ubuntu1) ... 148s Setting up python-vcr-doc (6.0.2-2) ... 148s Setting up cpp-s390x-linux-gnu (4:14.1.0-2ubuntu1) ... 148s Setting up python3-flasgger (0.9.7.2~dev2+dfsg-3) ... 149s Setting up po-debconf (1.0.21+nmu1) ... 149s Setting up python3-s3transfer (0.10.1-1ubuntu2) ... 149s Setting up gcc-14-s390x-linux-gnu (14.2.0-13ubuntu1) ... 149s Setting up gcc-s390x-linux-gnu (4:14.1.0-2ubuntu1) ... 149s Setting up sphinx-common (8.1.3-3) ... 149s Setting up g++-14-s390x-linux-gnu (14.2.0-13ubuntu1) ... 149s Setting up python3-boto3 (1.34.46+dfsg-1ubuntu1) ... 149s Setting up python3-httpbin (0.10.2+dfsg-2) ... 149s Setting up cpp (4:14.1.0-2ubuntu1) ... 149s Setting up python3-pytest-httpbin (2.1.0-1) ... 149s Setting up g++-s390x-linux-gnu (4:14.1.0-2ubuntu1) ... 149s Setting up gcc-14 (14.2.0-13ubuntu1) ... 149s Setting up g++-14 (14.2.0-13ubuntu1) ... 149s Setting up libtool (2.4.7-8) ... 149s Setting up gcc (4:14.1.0-2ubuntu1) ... 149s Setting up dh-autoreconf (20) ... 149s Setting up g++ (4:14.1.0-2ubuntu1) ... 
149s update-alternatives: using /usr/bin/g++ to provide /usr/bin/c++ (c++) in auto mode
149s Setting up build-essential (12.10ubuntu1) ...
149s Setting up debhelper (13.20ubuntu1) ...
149s Setting up pybuild-plugin-autopkgtest (6.20241217) ...
149s Processing triggers for install-info (7.1.1-1) ...
149s Processing triggers for libc-bin (2.40-4ubuntu1) ...
149s Processing triggers for systemd (257-2ubuntu1) ...
149s Processing triggers for man-db (2.13.0-1) ...
150s Processing triggers for sgml-base (1.31) ...
150s Setting up docutils-common (0.21.2+dfsg-2) ...
150s Processing triggers for sgml-base (1.31) ...
150s Setting up python3-docutils (0.21.2+dfsg-2) ...
151s Setting up python3-sphinx (8.1.3-3) ...
152s Setting up python3-sphinxcontrib.jquery (4.1-5) ...
152s Setting up python3-sphinx-rtd-theme (3.0.2+dfsg-1) ...
154s autopkgtest [02:32:06]: test pybuild-autopkgtest: pybuild-autopkgtest
154s autopkgtest [02:32:06]: test pybuild-autopkgtest: [-----------------------
154s pybuild-autopkgtest
154s I: pybuild base:311: cd /tmp/autopkgtest.ZTKQrF/autopkgtest_tmp/build; python3.13 -m pytest --ignore tests/integration/test_aiohttp.py --ignore tests/integration/test_tornado.py --ignore tests/integration/test_requests.py -m "not online" -k "not test_basic_json_use and not test_load_cassette_with_custom_persister"
156s ============================= test session starts ==============================
156s platform linux -- Python 3.13.1, pytest-8.3.4, pluggy-1.5.0
156s rootdir: /tmp/autopkgtest.ZTKQrF/autopkgtest_tmp/build
156s plugins: tornado-0.8.1, httpbin-2.1.0, typeguard-4.4.1
156s collected 305 items / 19 deselected / 1 skipped / 286 selected
156s
156s tests/integration/test_basic.py .... [ 1%]
156s tests/integration/test_boto3.py ss [ 2%]
156s tests/integration/test_config.py . [ 2%]
156s tests/integration/test_filter.py .......... [ 5%]
156s tests/integration/test_httplib2.py ........ [ 8%]
156s tests/integration/test_urllib2.py ........ [ 11%]
156s tests/integration/test_urllib3.py FFFFFFF [ 13%]
156s tests/integration/test_httplib2.py ........ [ 16%]
156s tests/integration/test_urllib2.py ........ [ 19%]
156s tests/integration/test_urllib3.py FFFFFFF [ 22%]
156s tests/integration/test_httplib2.py . [ 22%]
156s tests/integration/test_ignore.py .... [ 23%]
157s tests/integration/test_matchers.py .............. [ 28%]
157s tests/integration/test_multiple.py . [ 29%]
157s tests/integration/test_proxy.py F [ 29%]
157s tests/integration/test_record_mode.py ........ [ 32%]
157s tests/integration/test_register_persister.py .. [ 32%]
157s tests/integration/test_register_serializer.py . [ 33%]
157s tests/integration/test_request.py .. [ 33%]
157s tests/integration/test_stubs.py .... [ 35%]
157s tests/integration/test_urllib2.py . [ 35%]
157s tests/integration/test_urllib3.py FF. [ 36%]
157s tests/integration/test_wild.py F.F. [ 38%]
157s tests/unit/test_cassettes.py ............................... [ 48%]
157s tests/unit/test_errors.py .... [ 50%]
157s tests/unit/test_filters.py ........................ [ 58%]
157s tests/unit/test_json_serializer.py . [ 59%]
157s tests/unit/test_matchers.py ............................ [ 68%]
157s tests/unit/test_migration.py ... [ 69%]
157s tests/unit/test_persist.py .... [ 71%]
157s tests/unit/test_request.py ................. [ 77%]
157s tests/unit/test_response.py .... [ 78%]
157s tests/unit/test_serialize.py ............... [ 83%]
157s tests/unit/test_stubs.py ... [ 84%]
157s tests/unit/test_unittest.py ....... [ 87%]
157s tests/unit/test_util.py ........... [ 91%]
157s tests/unit/test_vcr.py ........................ [ 99%]
158s tests/unit/test_vcr_import.py . [100%]
158s
158s =================================== FAILURES ===================================
158s ____________________________ test_status_code[http] ____________________________
158s
158s httpbin_both =
158s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-0/test_status_code_http_0')
158s verify_pool_mgr =
158s
158s def test_status_code(httpbin_both, tmpdir, verify_pool_mgr):
158s """Ensure that we can read the status code"""
158s url = httpbin_both.url
158s with vcr.use_cassette(str(tmpdir.join("atts.yaml"))):
158s > status_code = verify_pool_mgr.request("GET", url).status
158s
158s tests/integration/test_urllib3.py:34:
158s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
158s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request
158s return self.request_encode_url(
158s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url
158s return self.urlopen(method, url, **extra_kw)
158s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen
158s response = conn.urlopen(method, u.request_uri, **kw)
158s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen
158s response = self._make_request(
158s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
158s
158s self =
158s conn =
158s method = 'GET', url = '/', body = None, headers = {}
158s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None)
158s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None)
158s chunked = False, response_conn = None, preload_content = True
158s decode_content = True, enforce_content_length = True
158s
158s def _make_request(
158s self,
158s conn: BaseHTTPConnection,
158s method: str,
158s url: str,
158s body: _TYPE_BODY | None = None,
158s headers: typing.Mapping[str, str] | None = None,
158s retries: Retry | None = None,
158s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
158s chunked: bool = False,
158s response_conn: BaseHTTPConnection | None = None,
158s preload_content: bool = True,
158s decode_content: bool = True,
158s enforce_content_length: bool = True,
158s ) -> BaseHTTPResponse:
158s """
158s Perform a request on a given urllib connection object taken from our
158s pool.
158s
158s :param conn:
158s a connection from one of our connection pools
158s
158s :param method:
158s HTTP request method (such as GET, POST, PUT, etc.)
158s
158s :param url:
158s The URL to perform the request on.
158s
158s :param body:
158s Data to send in the request body, either :class:`str`, :class:`bytes`,
158s an iterable of :class:`str`/:class:`bytes`, or a file-like object.
158s
158s :param headers:
158s Dictionary of custom headers to send, such as User-Agent,
158s If-None-Match, etc. If None, pool headers are used. If provided,
158s these headers completely replace any pool-specific headers.
158s
158s :param retries:
158s Configure the number of retries to allow before raising a
158s :class:`~urllib3.exceptions.MaxRetryError` exception.
158s
158s Pass ``None`` to retry until you receive a response. Pass a
158s :class:`~urllib3.util.retry.Retry` object for fine-grained control
158s over different types of retries.
158s Pass an integer number to retry connection errors that many times,
158s but no other types of errors. Pass zero to never retry.
158s 158s If ``False``, then retries are disabled and any exception is raised 158s immediately. Also, instead of raising a MaxRetryError on redirects, 158s the redirect response will be returned. 158s 158s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 158s 158s :param timeout: 158s If specified, overrides the default timeout for this one 158s request. It may be a float (in seconds) or an instance of 158s :class:`urllib3.util.Timeout`. 158s 158s :param chunked: 158s If True, urllib3 will send the body using chunked transfer 158s encoding. Otherwise, urllib3 will send the body using the standard 158s content-length form. Defaults to False. 158s 158s :param response_conn: 158s Set this to ``None`` if you will handle releasing the connection or 158s set the connection to have the response release it. 158s 158s :param preload_content: 158s If True, the response's body will be preloaded during construction. 158s 158s :param decode_content: 158s If True, will attempt to decode the body based on the 158s 'content-encoding' header. 158s 158s :param enforce_content_length: 158s Enforce content length checking. Body returned by server must match 158s value of Content-Length header, if present. Otherwise, raise error. 158s """ 158s self.num_requests += 1 158s 158s timeout_obj = self._get_timeout(timeout) 158s timeout_obj.start_connect() 158s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 158s 158s try: 158s # Trigger any extra validation we need to do. 158s try: 158s self._validate_conn(conn) 158s except (SocketTimeout, BaseSSLError) as e: 158s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 158s raise 158s 158s # _validate_conn() starts the connection to an HTTPS proxy 158s # so we need to wrap errors with 'ProxyError' here too. 158s except ( 158s OSError, 158s NewConnectionError, 158s TimeoutError, 158s BaseSSLError, 158s CertificateError, 158s SSLError, 158s ) as e: 158s new_e: Exception = e 158s if isinstance(e, (BaseSSLError, CertificateError)): 158s new_e = SSLError(e) 158s # If the connection didn't successfully connect to it's proxy 158s # then there 158s if isinstance( 158s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 158s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 158s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 158s raise new_e 158s 158s # conn.request() calls http.client.*.request, not the method in 158s # urllib3.request. It also calls makefile (recv) on the socket. 158s try: 158s conn.request( 158s method, 158s url, 158s body=body, 158s headers=headers, 158s chunked=chunked, 158s preload_content=preload_content, 158s decode_content=decode_content, 158s enforce_content_length=enforce_content_length, 158s ) 158s 158s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 158s # legitimately able to close the connection after sending a valid response. 158s # With this behaviour, the received response is still readable. 158s except BrokenPipeError: 158s pass 158s except OSError as e: 158s # MacOS/Linux 158s # EPROTOTYPE and ECONNRESET are needed on macOS 158s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 158s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
158s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 158s raise 158s 158s # Reset the timeout for the recv() on the socket 158s read_timeout = timeout_obj.read_timeout 158s 158s if not conn.is_closed: 158s # In Python 3 socket.py will catch EAGAIN and return None when you 158s # try and read into the file pointer created by http.client, which 158s # instead raises a BadStatusLine exception. Instead of catching 158s # the exception and assuming all BadStatusLine exceptions are read 158s # timeouts, check for a zero timeout before making the request. 158s if read_timeout == 0: 158s raise ReadTimeoutError( 158s self, url, f"Read timed out. (read timeout={read_timeout})" 158s ) 158s conn.timeout = read_timeout 158s 158s # Receive the response from the server 158s try: 158s response = conn.getresponse() 158s except (BaseSSLError, OSError) as e: 158s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 158s raise 158s 158s # Set properties that are used by the pooling layer. 158s response.retries = retries 158s response._connection = response_conn # type: ignore[attr-defined] 158s response._pool = self # type: ignore[attr-defined] 158s 158s log.debug( 158s '%s://%s:%s "%s %s %s" %s %s', 158s self.scheme, 158s self.host, 158s self.port, 158s method, 158s url, 158s > response.version_string, 158s response.status, 158s response.length_remaining, 158s ) 158s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 158s 158s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 158s ----------------------------- Captured stderr call ----------------------------- 158s 127.0.0.1 - - [18/Jan/2025 02:32:08] "GET / HTTP/1.1" 200 9358 158s ______________________________ test_headers[http] ______________________________ 158s 158s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-0/test_headers_http_0') 158s httpbin_both = 158s verify_pool_mgr = 158s 158s def test_headers(tmpdir, httpbin_both, verify_pool_mgr): 158s """Ensure that we can read the headers back""" 158s url = httpbin_both.url 158s with vcr.use_cassette(str(tmpdir.join("headers.yaml"))): 158s > headers = verify_pool_mgr.request("GET", url).headers 158s 158s tests/integration/test_urllib3.py:44: 158s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 158s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 158s return self.request_encode_url( 158s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 158s return self.urlopen(method, url, **extra_kw) 158s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 158s response = conn.urlopen(method, u.request_uri, **kw) 158s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 158s response = self._make_request( 158s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 158s 158s self = 158s conn = 158s method = 'GET', url = '/', body = None, headers = {} 158s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 158s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 158s chunked = False, response_conn = None, preload_content = True 158s decode_content = True, enforce_content_length = True 158s 158s def _make_request( 158s self, 158s conn: BaseHTTPConnection, 158s method: str, 158s url: str, 158s body: _TYPE_BODY | None = None, 158s headers: typing.Mapping[str, str] | None = None, 158s retries: Retry | None = None, 158s 
timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 158s chunked: bool = False, 158s response_conn: BaseHTTPConnection | None = None, 158s preload_content: bool = True, 158s decode_content: bool = True, 158s enforce_content_length: bool = True, 158s ) -> BaseHTTPResponse: 158s """ 158s Perform a request on a given urllib connection object taken from our 158s pool. 158s 158s :param conn: 158s a connection from one of our connection pools 158s 158s :param method: 158s HTTP request method (such as GET, POST, PUT, etc.) 158s 158s :param url: 158s The URL to perform the request on. 158s 158s :param body: 158s Data to send in the request body, either :class:`str`, :class:`bytes`, 158s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 158s 158s :param headers: 158s Dictionary of custom headers to send, such as User-Agent, 158s If-None-Match, etc. If None, pool headers are used. If provided, 158s these headers completely replace any pool-specific headers. 158s 158s :param retries: 158s Configure the number of retries to allow before raising a 158s :class:`~urllib3.exceptions.MaxRetryError` exception. 158s 158s Pass ``None`` to retry until you receive a response. Pass a 158s :class:`~urllib3.util.retry.Retry` object for fine-grained control 158s over different types of retries. 158s Pass an integer number to retry connection errors that many times, 158s but no other types of errors. Pass zero to never retry. 158s 158s If ``False``, then retries are disabled and any exception is raised 158s immediately. Also, instead of raising a MaxRetryError on redirects, 158s the redirect response will be returned. 158s 158s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 158s 158s :param timeout: 158s If specified, overrides the default timeout for this one 158s request. It may be a float (in seconds) or an instance of 158s :class:`urllib3.util.Timeout`. 158s 158s :param chunked: 158s If True, urllib3 will send the body using chunked transfer 158s encoding. Otherwise, urllib3 will send the body using the standard 158s content-length form. Defaults to False. 158s 158s :param response_conn: 158s Set this to ``None`` if you will handle releasing the connection or 158s set the connection to have the response release it. 158s 158s :param preload_content: 158s If True, the response's body will be preloaded during construction. 158s 158s :param decode_content: 158s If True, will attempt to decode the body based on the 158s 'content-encoding' header. 158s 158s :param enforce_content_length: 158s Enforce content length checking. Body returned by server must match 158s value of Content-Length header, if present. Otherwise, raise error. 158s """ 158s self.num_requests += 1 158s 158s timeout_obj = self._get_timeout(timeout) 158s timeout_obj.start_connect() 158s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 158s 158s try: 158s # Trigger any extra validation we need to do. 158s try: 158s self._validate_conn(conn) 158s except (SocketTimeout, BaseSSLError) as e: 158s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 158s raise 158s 158s # _validate_conn() starts the connection to an HTTPS proxy 158s # so we need to wrap errors with 'ProxyError' here too. 
158s except ( 158s OSError, 158s NewConnectionError, 158s TimeoutError, 158s BaseSSLError, 158s CertificateError, 158s SSLError, 158s ) as e: 158s new_e: Exception = e 158s if isinstance(e, (BaseSSLError, CertificateError)): 158s new_e = SSLError(e) 158s # If the connection didn't successfully connect to it's proxy 158s # then there 158s if isinstance( 158s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 158s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 158s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 158s raise new_e 158s 158s # conn.request() calls http.client.*.request, not the method in 158s # urllib3.request. It also calls makefile (recv) on the socket. 158s try: 158s conn.request( 158s method, 158s url, 158s body=body, 158s headers=headers, 158s chunked=chunked, 158s preload_content=preload_content, 158s decode_content=decode_content, 158s enforce_content_length=enforce_content_length, 158s ) 158s 158s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 158s # legitimately able to close the connection after sending a valid response. 158s # With this behaviour, the received response is still readable. 158s except BrokenPipeError: 158s pass 158s except OSError as e: 158s # MacOS/Linux 158s # EPROTOTYPE and ECONNRESET are needed on macOS 158s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 158s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 158s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 158s raise 158s 158s # Reset the timeout for the recv() on the socket 158s read_timeout = timeout_obj.read_timeout 158s 158s if not conn.is_closed: 158s # In Python 3 socket.py will catch EAGAIN and return None when you 158s # try and read into the file pointer created by http.client, which 158s # instead raises a BadStatusLine exception. Instead of catching 158s # the exception and assuming all BadStatusLine exceptions are read 158s # timeouts, check for a zero timeout before making the request. 158s if read_timeout == 0: 158s raise ReadTimeoutError( 158s self, url, f"Read timed out. (read timeout={read_timeout})" 158s ) 158s conn.timeout = read_timeout 158s 158s # Receive the response from the server 158s try: 158s response = conn.getresponse() 158s except (BaseSSLError, OSError) as e: 158s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 158s raise 158s 158s # Set properties that are used by the pooling layer. 
158s response.retries = retries 158s response._connection = response_conn # type: ignore[attr-defined] 158s response._pool = self # type: ignore[attr-defined] 158s 158s log.debug( 158s '%s://%s:%s "%s %s %s" %s %s', 158s self.scheme, 158s self.host, 158s self.port, 158s method, 158s url, 158s > response.version_string, 158s response.status, 158s response.length_remaining, 158s ) 158s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 158s 158s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 158s ----------------------------- Captured stderr call ----------------------------- 158s 127.0.0.1 - - [18/Jan/2025 02:32:08] "GET / HTTP/1.1" 200 9358 158s _______________________________ test_body[http] ________________________________ 158s 158s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-0/test_body_http_0') 158s httpbin_both = 158s verify_pool_mgr = 158s 158s def test_body(tmpdir, httpbin_both, verify_pool_mgr): 158s """Ensure the responses are all identical enough""" 158s url = httpbin_both.url + "/bytes/1024" 158s with vcr.use_cassette(str(tmpdir.join("body.yaml"))): 158s > content = verify_pool_mgr.request("GET", url).data 158s 158s tests/integration/test_urllib3.py:55: 158s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 158s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 158s return self.request_encode_url( 158s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 158s return self.urlopen(method, url, **extra_kw) 158s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 158s response = conn.urlopen(method, u.request_uri, **kw) 158s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 158s response = self._make_request( 158s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 158s 158s self = 158s conn = 158s method = 'GET', url = '/bytes/1024', body = None, headers = {} 158s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 158s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 158s chunked = False, response_conn = None, preload_content = True 158s decode_content = True, enforce_content_length = True 158s 158s def _make_request( 158s self, 158s conn: BaseHTTPConnection, 158s method: str, 158s url: str, 158s body: _TYPE_BODY | None = None, 158s headers: typing.Mapping[str, str] | None = None, 158s retries: Retry | None = None, 158s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 158s chunked: bool = False, 158s response_conn: BaseHTTPConnection | None = None, 158s preload_content: bool = True, 158s decode_content: bool = True, 158s enforce_content_length: bool = True, 158s ) -> BaseHTTPResponse: 158s """ 158s Perform a request on a given urllib connection object taken from our 158s pool. 158s 158s :param conn: 158s a connection from one of our connection pools 158s 158s :param method: 158s HTTP request method (such as GET, POST, PUT, etc.) 158s 158s :param url: 158s The URL to perform the request on. 158s 158s :param body: 158s Data to send in the request body, either :class:`str`, :class:`bytes`, 158s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 158s 158s :param headers: 158s Dictionary of custom headers to send, such as User-Agent, 158s If-None-Match, etc. If None, pool headers are used. If provided, 158s these headers completely replace any pool-specific headers. 
158s 158s :param retries: 158s Configure the number of retries to allow before raising a 158s :class:`~urllib3.exceptions.MaxRetryError` exception. 158s 158s Pass ``None`` to retry until you receive a response. Pass a 158s :class:`~urllib3.util.retry.Retry` object for fine-grained control 158s over different types of retries. 158s Pass an integer number to retry connection errors that many times, 158s but no other types of errors. Pass zero to never retry. 158s 158s If ``False``, then retries are disabled and any exception is raised 158s immediately. Also, instead of raising a MaxRetryError on redirects, 158s the redirect response will be returned. 158s 158s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 158s 158s :param timeout: 158s If specified, overrides the default timeout for this one 158s request. It may be a float (in seconds) or an instance of 158s :class:`urllib3.util.Timeout`. 158s 158s :param chunked: 158s If True, urllib3 will send the body using chunked transfer 158s encoding. Otherwise, urllib3 will send the body using the standard 158s content-length form. Defaults to False. 158s 158s :param response_conn: 158s Set this to ``None`` if you will handle releasing the connection or 158s set the connection to have the response release it. 158s 158s :param preload_content: 158s If True, the response's body will be preloaded during construction. 158s 158s :param decode_content: 158s If True, will attempt to decode the body based on the 158s 'content-encoding' header. 158s 158s :param enforce_content_length: 158s Enforce content length checking. Body returned by server must match 158s value of Content-Length header, if present. Otherwise, raise error. 158s """ 158s self.num_requests += 1 158s 158s timeout_obj = self._get_timeout(timeout) 158s timeout_obj.start_connect() 158s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 158s 158s try: 158s # Trigger any extra validation we need to do. 158s try: 158s self._validate_conn(conn) 158s except (SocketTimeout, BaseSSLError) as e: 158s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 158s raise 158s 158s # _validate_conn() starts the connection to an HTTPS proxy 158s # so we need to wrap errors with 'ProxyError' here too. 158s except ( 158s OSError, 158s NewConnectionError, 158s TimeoutError, 158s BaseSSLError, 158s CertificateError, 158s SSLError, 158s ) as e: 158s new_e: Exception = e 158s if isinstance(e, (BaseSSLError, CertificateError)): 158s new_e = SSLError(e) 158s # If the connection didn't successfully connect to it's proxy 158s # then there 158s if isinstance( 158s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 158s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 158s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 158s raise new_e 158s 158s # conn.request() calls http.client.*.request, not the method in 158s # urllib3.request. It also calls makefile (recv) on the socket. 158s try: 158s conn.request( 158s method, 158s url, 158s body=body, 158s headers=headers, 158s chunked=chunked, 158s preload_content=preload_content, 158s decode_content=decode_content, 158s enforce_content_length=enforce_content_length, 158s ) 158s 158s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 158s # legitimately able to close the connection after sending a valid response. 158s # With this behaviour, the received response is still readable. 
158s except BrokenPipeError: 158s pass 158s except OSError as e: 158s # MacOS/Linux 158s # EPROTOTYPE and ECONNRESET are needed on macOS 158s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 158s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 158s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 158s raise 158s 158s # Reset the timeout for the recv() on the socket 158s read_timeout = timeout_obj.read_timeout 158s 158s if not conn.is_closed: 158s # In Python 3 socket.py will catch EAGAIN and return None when you 158s # try and read into the file pointer created by http.client, which 158s # instead raises a BadStatusLine exception. Instead of catching 158s # the exception and assuming all BadStatusLine exceptions are read 158s # timeouts, check for a zero timeout before making the request. 158s if read_timeout == 0: 158s raise ReadTimeoutError( 158s self, url, f"Read timed out. (read timeout={read_timeout})" 158s ) 158s conn.timeout = read_timeout 158s 158s # Receive the response from the server 158s try: 158s response = conn.getresponse() 158s except (BaseSSLError, OSError) as e: 158s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 158s raise 158s 158s # Set properties that are used by the pooling layer. 158s response.retries = retries 158s response._connection = response_conn # type: ignore[attr-defined] 158s response._pool = self # type: ignore[attr-defined] 158s 158s log.debug( 158s '%s://%s:%s "%s %s %s" %s %s', 158s self.scheme, 158s self.host, 158s self.port, 158s method, 158s url, 158s > response.version_string, 158s response.status, 158s response.length_remaining, 158s ) 158s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 158s 158s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 158s ----------------------------- Captured stderr call ----------------------------- 158s 127.0.0.1 - - [18/Jan/2025 02:32:08] "GET /bytes/1024 HTTP/1.1" 200 1024 158s _______________________________ test_auth[http] ________________________________ 158s 158s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-0/test_auth_http_0') 158s httpbin_both = 158s verify_pool_mgr = 158s 158s def test_auth(tmpdir, httpbin_both, verify_pool_mgr): 158s """Ensure that we can handle basic auth""" 158s auth = ("user", "passwd") 158s headers = urllib3.util.make_headers(basic_auth="{}:{}".format(*auth)) 158s url = httpbin_both.url + "/basic-auth/user/passwd" 158s with vcr.use_cassette(str(tmpdir.join("auth.yaml"))): 158s > one = verify_pool_mgr.request("GET", url, headers=headers) 158s 158s tests/integration/test_urllib3.py:67: 158s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 158s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 158s return self.request_encode_url( 158s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 158s return self.urlopen(method, url, **extra_kw) 158s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 158s response = conn.urlopen(method, u.request_uri, **kw) 158s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 158s response = self._make_request( 158s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 158s 158s self = 158s conn = 158s method = 'GET', url = '/basic-auth/user/passwd', body = None 158s headers = {'authorization': 'Basic dXNlcjpwYXNzd2Q='} 158s retries = Retry(total=3, 
connect=None, read=None, redirect=None, status=None) 158s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 158s chunked = False, response_conn = None, preload_content = True 158s decode_content = True, enforce_content_length = True 158s 158s def _make_request( 158s self, 158s conn: BaseHTTPConnection, 158s method: str, 158s url: str, 158s body: _TYPE_BODY | None = None, 158s headers: typing.Mapping[str, str] | None = None, 158s retries: Retry | None = None, 158s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 158s chunked: bool = False, 158s response_conn: BaseHTTPConnection | None = None, 158s preload_content: bool = True, 158s decode_content: bool = True, 158s enforce_content_length: bool = True, 158s ) -> BaseHTTPResponse: 158s """ 158s Perform a request on a given urllib connection object taken from our 158s pool. 158s 158s :param conn: 158s a connection from one of our connection pools 158s 158s :param method: 158s HTTP request method (such as GET, POST, PUT, etc.) 158s 158s :param url: 158s The URL to perform the request on. 158s 158s :param body: 158s Data to send in the request body, either :class:`str`, :class:`bytes`, 158s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 158s 158s :param headers: 158s Dictionary of custom headers to send, such as User-Agent, 158s If-None-Match, etc. If None, pool headers are used. If provided, 158s these headers completely replace any pool-specific headers. 158s 158s :param retries: 158s Configure the number of retries to allow before raising a 158s :class:`~urllib3.exceptions.MaxRetryError` exception. 158s 158s Pass ``None`` to retry until you receive a response. Pass a 158s :class:`~urllib3.util.retry.Retry` object for fine-grained control 158s over different types of retries. 158s Pass an integer number to retry connection errors that many times, 158s but no other types of errors. Pass zero to never retry. 158s 158s If ``False``, then retries are disabled and any exception is raised 158s immediately. Also, instead of raising a MaxRetryError on redirects, 158s the redirect response will be returned. 158s 158s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 158s 158s :param timeout: 158s If specified, overrides the default timeout for this one 158s request. It may be a float (in seconds) or an instance of 158s :class:`urllib3.util.Timeout`. 158s 158s :param chunked: 158s If True, urllib3 will send the body using chunked transfer 158s encoding. Otherwise, urllib3 will send the body using the standard 158s content-length form. Defaults to False. 158s 158s :param response_conn: 158s Set this to ``None`` if you will handle releasing the connection or 158s set the connection to have the response release it. 158s 158s :param preload_content: 158s If True, the response's body will be preloaded during construction. 158s 158s :param decode_content: 158s If True, will attempt to decode the body based on the 158s 'content-encoding' header. 158s 158s :param enforce_content_length: 158s Enforce content length checking. Body returned by server must match 158s value of Content-Length header, if present. Otherwise, raise error. 158s """ 158s self.num_requests += 1 158s 158s timeout_obj = self._get_timeout(timeout) 158s timeout_obj.start_connect() 158s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 158s 158s try: 158s # Trigger any extra validation we need to do. 
158s try: 158s self._validate_conn(conn) 158s except (SocketTimeout, BaseSSLError) as e: 158s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 158s raise 158s 158s # _validate_conn() starts the connection to an HTTPS proxy 158s # so we need to wrap errors with 'ProxyError' here too. 158s except ( 158s OSError, 158s NewConnectionError, 158s TimeoutError, 158s BaseSSLError, 158s CertificateError, 158s SSLError, 158s ) as e: 158s new_e: Exception = e 158s if isinstance(e, (BaseSSLError, CertificateError)): 158s new_e = SSLError(e) 158s # If the connection didn't successfully connect to it's proxy 158s # then there 158s if isinstance( 158s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 158s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 158s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 158s raise new_e 158s 158s # conn.request() calls http.client.*.request, not the method in 158s # urllib3.request. It also calls makefile (recv) on the socket. 158s try: 158s conn.request( 158s method, 158s url, 158s body=body, 158s headers=headers, 158s chunked=chunked, 158s preload_content=preload_content, 158s decode_content=decode_content, 158s enforce_content_length=enforce_content_length, 158s ) 158s 158s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 158s # legitimately able to close the connection after sending a valid response. 158s # With this behaviour, the received response is still readable. 158s except BrokenPipeError: 158s pass 158s except OSError as e: 158s # MacOS/Linux 158s # EPROTOTYPE and ECONNRESET are needed on macOS 158s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 158s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 158s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 158s raise 158s 158s # Reset the timeout for the recv() on the socket 158s read_timeout = timeout_obj.read_timeout 158s 158s if not conn.is_closed: 158s # In Python 3 socket.py will catch EAGAIN and return None when you 158s # try and read into the file pointer created by http.client, which 158s # instead raises a BadStatusLine exception. Instead of catching 158s # the exception and assuming all BadStatusLine exceptions are read 158s # timeouts, check for a zero timeout before making the request. 158s if read_timeout == 0: 158s raise ReadTimeoutError( 158s self, url, f"Read timed out. (read timeout={read_timeout})" 158s ) 158s conn.timeout = read_timeout 158s 158s # Receive the response from the server 158s try: 158s response = conn.getresponse() 158s except (BaseSSLError, OSError) as e: 158s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 158s raise 158s 158s # Set properties that are used by the pooling layer. 
158s response.retries = retries 158s response._connection = response_conn # type: ignore[attr-defined] 158s response._pool = self # type: ignore[attr-defined] 158s 158s log.debug( 158s '%s://%s:%s "%s %s %s" %s %s', 158s self.scheme, 158s self.host, 158s self.port, 158s method, 158s url, 158s > response.version_string, 158s response.status, 158s response.length_remaining, 158s ) 158s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 158s 158s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 158s ----------------------------- Captured stderr call ----------------------------- 158s 127.0.0.1 - - [18/Jan/2025 02:32:08] "GET /basic-auth/user/passwd HTTP/1.1" 200 46 158s ____________________________ test_auth_failed[http] ____________________________ 158s 158s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-0/test_auth_failed_http_0') 158s httpbin_both = 158s verify_pool_mgr = 158s 158s def test_auth_failed(tmpdir, httpbin_both, verify_pool_mgr): 158s """Ensure that we can save failed auth statuses""" 158s auth = ("user", "wrongwrongwrong") 158s headers = urllib3.util.make_headers(basic_auth="{}:{}".format(*auth)) 158s url = httpbin_both.url + "/basic-auth/user/passwd" 158s with vcr.use_cassette(str(tmpdir.join("auth-failed.yaml"))) as cass: 158s # Ensure that this is empty to begin with 158s assert_cassette_empty(cass) 158s > one = verify_pool_mgr.request("GET", url, headers=headers) 158s 158s tests/integration/test_urllib3.py:83: 158s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 158s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 158s return self.request_encode_url( 158s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 158s return self.urlopen(method, url, **extra_kw) 158s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 158s response = conn.urlopen(method, u.request_uri, **kw) 158s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 158s response = self._make_request( 158s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 158s 158s self = 158s conn = 158s method = 'GET', url = '/basic-auth/user/passwd', body = None 158s headers = {'authorization': 'Basic dXNlcjp3cm9uZ3dyb25nd3Jvbmc='} 158s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 158s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 158s chunked = False, response_conn = None, preload_content = True 158s decode_content = True, enforce_content_length = True 158s 158s def _make_request( 158s self, 158s conn: BaseHTTPConnection, 158s method: str, 158s url: str, 158s body: _TYPE_BODY | None = None, 158s headers: typing.Mapping[str, str] | None = None, 158s retries: Retry | None = None, 158s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 158s chunked: bool = False, 158s response_conn: BaseHTTPConnection | None = None, 158s preload_content: bool = True, 158s decode_content: bool = True, 158s enforce_content_length: bool = True, 158s ) -> BaseHTTPResponse: 158s """ 158s Perform a request on a given urllib connection object taken from our 158s pool. 158s 158s :param conn: 158s a connection from one of our connection pools 158s 158s :param method: 158s HTTP request method (such as GET, POST, PUT, etc.) 158s 158s :param url: 158s The URL to perform the request on. 
158s 158s :param body: 158s Data to send in the request body, either :class:`str`, :class:`bytes`, 158s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 158s 158s :param headers: 158s Dictionary of custom headers to send, such as User-Agent, 158s If-None-Match, etc. If None, pool headers are used. If provided, 158s these headers completely replace any pool-specific headers. 158s 158s :param retries: 158s Configure the number of retries to allow before raising a 158s :class:`~urllib3.exceptions.MaxRetryError` exception. 158s 158s Pass ``None`` to retry until you receive a response. Pass a 158s :class:`~urllib3.util.retry.Retry` object for fine-grained control 158s over different types of retries. 158s Pass an integer number to retry connection errors that many times, 158s but no other types of errors. Pass zero to never retry. 158s 158s If ``False``, then retries are disabled and any exception is raised 158s immediately. Also, instead of raising a MaxRetryError on redirects, 158s the redirect response will be returned. 158s 158s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 158s 158s :param timeout: 158s If specified, overrides the default timeout for this one 158s request. It may be a float (in seconds) or an instance of 158s :class:`urllib3.util.Timeout`. 158s 158s :param chunked: 158s If True, urllib3 will send the body using chunked transfer 158s encoding. Otherwise, urllib3 will send the body using the standard 158s content-length form. Defaults to False. 158s 158s :param response_conn: 158s Set this to ``None`` if you will handle releasing the connection or 158s set the connection to have the response release it. 158s 158s :param preload_content: 158s If True, the response's body will be preloaded during construction. 158s 158s :param decode_content: 158s If True, will attempt to decode the body based on the 158s 'content-encoding' header. 158s 158s :param enforce_content_length: 158s Enforce content length checking. Body returned by server must match 158s value of Content-Length header, if present. Otherwise, raise error. 158s """ 158s self.num_requests += 1 158s 158s timeout_obj = self._get_timeout(timeout) 158s timeout_obj.start_connect() 158s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 158s 158s try: 158s # Trigger any extra validation we need to do. 158s try: 158s self._validate_conn(conn) 158s except (SocketTimeout, BaseSSLError) as e: 158s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 158s raise 158s 158s # _validate_conn() starts the connection to an HTTPS proxy 158s # so we need to wrap errors with 'ProxyError' here too. 158s except ( 158s OSError, 158s NewConnectionError, 158s TimeoutError, 158s BaseSSLError, 158s CertificateError, 158s SSLError, 158s ) as e: 158s new_e: Exception = e 158s if isinstance(e, (BaseSSLError, CertificateError)): 158s new_e = SSLError(e) 158s # If the connection didn't successfully connect to it's proxy 158s # then there 158s if isinstance( 158s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 158s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 158s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 158s raise new_e 158s 158s # conn.request() calls http.client.*.request, not the method in 158s # urllib3.request. It also calls makefile (recv) on the socket. 
158s try: 158s conn.request( 158s method, 158s url, 158s body=body, 158s headers=headers, 158s chunked=chunked, 158s preload_content=preload_content, 158s decode_content=decode_content, 158s enforce_content_length=enforce_content_length, 158s ) 158s 158s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 158s # legitimately able to close the connection after sending a valid response. 158s # With this behaviour, the received response is still readable. 158s except BrokenPipeError: 158s pass 158s except OSError as e: 158s # MacOS/Linux 158s # EPROTOTYPE and ECONNRESET are needed on macOS 158s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 158s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 158s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 158s raise 158s 158s # Reset the timeout for the recv() on the socket 158s read_timeout = timeout_obj.read_timeout 158s 158s if not conn.is_closed: 158s # In Python 3 socket.py will catch EAGAIN and return None when you 158s # try and read into the file pointer created by http.client, which 158s # instead raises a BadStatusLine exception. Instead of catching 158s # the exception and assuming all BadStatusLine exceptions are read 158s # timeouts, check for a zero timeout before making the request. 158s if read_timeout == 0: 158s raise ReadTimeoutError( 158s self, url, f"Read timed out. (read timeout={read_timeout})" 158s ) 158s conn.timeout = read_timeout 158s 158s # Receive the response from the server 158s try: 158s response = conn.getresponse() 158s except (BaseSSLError, OSError) as e: 158s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 158s raise 158s 158s # Set properties that are used by the pooling layer. 
158s response.retries = retries 158s response._connection = response_conn # type: ignore[attr-defined] 158s response._pool = self # type: ignore[attr-defined] 158s 158s log.debug( 158s '%s://%s:%s "%s %s %s" %s %s', 158s self.scheme, 158s self.host, 158s self.port, 158s method, 158s url, 158s > response.version_string, 158s response.status, 158s response.length_remaining, 158s ) 158s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 158s 158s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 158s ----------------------------- Captured stderr call ----------------------------- 158s 127.0.0.1 - - [18/Jan/2025 02:32:08] "GET /basic-auth/user/passwd HTTP/1.1" 401 0 158s _______________________________ test_post[http] ________________________________ 158s 158s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-0/test_post_http_0') 158s httpbin_both = 158s verify_pool_mgr = 158s 158s def test_post(tmpdir, httpbin_both, verify_pool_mgr): 158s """Ensure that we can post and cache the results""" 158s data = {"key1": "value1", "key2": "value2"} 158s url = httpbin_both.url + "/post" 158s with vcr.use_cassette(str(tmpdir.join("verify_pool_mgr.yaml"))): 158s > req1 = verify_pool_mgr.request("POST", url, data).data 158s 158s tests/integration/test_urllib3.py:94: 158s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 158s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:143: in request 158s return self.request_encode_body( 158s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:278: in request_encode_body 158s return self.urlopen(method, url, **extra_kw) 158s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 158s response = conn.urlopen(method, u.request_uri, **kw) 158s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 158s response = self._make_request( 158s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 158s 158s self = 158s conn = 158s method = 'POST', url = '/post', body = {'key1': 'value1', 'key2': 'value2'} 158s headers = HTTPHeaderDict({}) 158s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 158s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 158s chunked = False, response_conn = None, preload_content = True 158s decode_content = True, enforce_content_length = True 158s 158s def _make_request( 158s self, 158s conn: BaseHTTPConnection, 158s method: str, 158s url: str, 158s body: _TYPE_BODY | None = None, 158s headers: typing.Mapping[str, str] | None = None, 158s retries: Retry | None = None, 158s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 158s chunked: bool = False, 158s response_conn: BaseHTTPConnection | None = None, 158s preload_content: bool = True, 158s decode_content: bool = True, 158s enforce_content_length: bool = True, 158s ) -> BaseHTTPResponse: 158s """ 158s Perform a request on a given urllib connection object taken from our 158s pool. 158s 158s :param conn: 158s a connection from one of our connection pools 158s 158s :param method: 158s HTTP request method (such as GET, POST, PUT, etc.) 158s 158s :param url: 158s The URL to perform the request on. 158s 158s :param body: 158s Data to send in the request body, either :class:`str`, :class:`bytes`, 158s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 158s 158s :param headers: 158s Dictionary of custom headers to send, such as User-Agent, 158s If-None-Match, etc. 
If None, pool headers are used. If provided, 158s these headers completely replace any pool-specific headers. 158s 158s :param retries: 158s Configure the number of retries to allow before raising a 158s :class:`~urllib3.exceptions.MaxRetryError` exception. 158s 158s Pass ``None`` to retry until you receive a response. Pass a 158s :class:`~urllib3.util.retry.Retry` object for fine-grained control 158s over different types of retries. 158s Pass an integer number to retry connection errors that many times, 158s but no other types of errors. Pass zero to never retry. 158s 158s If ``False``, then retries are disabled and any exception is raised 158s immediately. Also, instead of raising a MaxRetryError on redirects, 158s the redirect response will be returned. 158s 158s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 158s 158s :param timeout: 158s If specified, overrides the default timeout for this one 158s request. It may be a float (in seconds) or an instance of 158s :class:`urllib3.util.Timeout`. 158s 158s :param chunked: 158s If True, urllib3 will send the body using chunked transfer 158s encoding. Otherwise, urllib3 will send the body using the standard 158s content-length form. Defaults to False. 158s 158s :param response_conn: 158s Set this to ``None`` if you will handle releasing the connection or 158s set the connection to have the response release it. 158s 158s :param preload_content: 158s If True, the response's body will be preloaded during construction. 158s 158s :param decode_content: 158s If True, will attempt to decode the body based on the 158s 'content-encoding' header. 158s 158s :param enforce_content_length: 158s Enforce content length checking. Body returned by server must match 158s value of Content-Length header, if present. Otherwise, raise error. 158s """ 158s self.num_requests += 1 158s 158s timeout_obj = self._get_timeout(timeout) 158s timeout_obj.start_connect() 158s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 158s 158s try: 158s # Trigger any extra validation we need to do. 158s try: 158s self._validate_conn(conn) 158s except (SocketTimeout, BaseSSLError) as e: 158s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 158s raise 158s 158s # _validate_conn() starts the connection to an HTTPS proxy 158s # so we need to wrap errors with 'ProxyError' here too. 158s except ( 158s OSError, 158s NewConnectionError, 158s TimeoutError, 158s BaseSSLError, 158s CertificateError, 158s SSLError, 158s ) as e: 158s new_e: Exception = e 158s if isinstance(e, (BaseSSLError, CertificateError)): 158s new_e = SSLError(e) 158s # If the connection didn't successfully connect to it's proxy 158s # then there 158s if isinstance( 158s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 158s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 158s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 158s raise new_e 158s 158s # conn.request() calls http.client.*.request, not the method in 158s # urllib3.request. It also calls makefile (recv) on the socket. 158s try: 158s conn.request( 158s method, 158s url, 158s body=body, 158s headers=headers, 158s chunked=chunked, 158s preload_content=preload_content, 158s decode_content=decode_content, 158s enforce_content_length=enforce_content_length, 158s ) 158s 158s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 158s # legitimately able to close the connection after sending a valid response. 
158s # With this behaviour, the received response is still readable. 158s except BrokenPipeError: 158s pass 158s except OSError as e: 158s # MacOS/Linux 158s # EPROTOTYPE and ECONNRESET are needed on macOS 158s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 158s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 158s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 158s raise 158s 158s # Reset the timeout for the recv() on the socket 158s read_timeout = timeout_obj.read_timeout 158s 158s if not conn.is_closed: 158s # In Python 3 socket.py will catch EAGAIN and return None when you 158s # try and read into the file pointer created by http.client, which 158s # instead raises a BadStatusLine exception. Instead of catching 158s # the exception and assuming all BadStatusLine exceptions are read 158s # timeouts, check for a zero timeout before making the request. 158s if read_timeout == 0: 158s raise ReadTimeoutError( 158s self, url, f"Read timed out. (read timeout={read_timeout})" 158s ) 158s conn.timeout = read_timeout 158s 158s # Receive the response from the server 158s try: 158s response = conn.getresponse() 158s except (BaseSSLError, OSError) as e: 158s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 158s raise 158s 158s # Set properties that are used by the pooling layer. 158s response.retries = retries 158s response._connection = response_conn # type: ignore[attr-defined] 158s response._pool = self # type: ignore[attr-defined] 158s 158s log.debug( 158s '%s://%s:%s "%s %s %s" %s %s', 158s self.scheme, 158s self.host, 158s self.port, 158s method, 158s url, 158s > response.version_string, 158s response.status, 158s response.length_remaining, 158s ) 158s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 158s 158s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 158s ----------------------------- Captured stderr call ----------------------------- 158s 127.0.0.1 - - [18/Jan/2025 02:32:08] "POST /post HTTP/1.1" 501 159 158s _______________________________ test_gzip[http] ________________________________ 158s 158s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-0/test_gzip_http_0') 158s httpbin_both = 158s verify_pool_mgr = 158s 158s def test_gzip(tmpdir, httpbin_both, verify_pool_mgr): 158s """ 158s Ensure that requests (actually urllib3) is able to automatically decompress 158s the response body 158s """ 158s url = httpbin_both.url + "/gzip" 158s response = verify_pool_mgr.request("GET", url) 158s 158s with vcr.use_cassette(str(tmpdir.join("gzip.yaml"))): 158s > response = verify_pool_mgr.request("GET", url) 158s 158s tests/integration/test_urllib3.py:140: 158s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 158s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 158s return self.request_encode_url( 158s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 158s return self.urlopen(method, url, **extra_kw) 158s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 158s response = conn.urlopen(method, u.request_uri, **kw) 158s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 158s response = self._make_request( 158s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 158s 158s self = 158s conn = 158s method = 'GET', url = '/gzip', body = None, headers = {} 158s retries = Retry(total=3, 
connect=None, read=None, redirect=None, status=None) 158s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 158s chunked = False, response_conn = None, preload_content = True 158s decode_content = True, enforce_content_length = True 158s 158s def _make_request( 158s self, 158s conn: BaseHTTPConnection, 158s method: str, 158s url: str, 158s body: _TYPE_BODY | None = None, 158s headers: typing.Mapping[str, str] | None = None, 158s retries: Retry | None = None, 158s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 158s chunked: bool = False, 158s response_conn: BaseHTTPConnection | None = None, 158s preload_content: bool = True, 158s decode_content: bool = True, 158s enforce_content_length: bool = True, 158s ) -> BaseHTTPResponse: 158s """ 158s Perform a request on a given urllib connection object taken from our 158s pool. 158s 158s :param conn: 158s a connection from one of our connection pools 158s 158s :param method: 158s HTTP request method (such as GET, POST, PUT, etc.) 158s 158s :param url: 158s The URL to perform the request on. 158s 158s :param body: 158s Data to send in the request body, either :class:`str`, :class:`bytes`, 158s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 158s 158s :param headers: 158s Dictionary of custom headers to send, such as User-Agent, 158s If-None-Match, etc. If None, pool headers are used. If provided, 158s these headers completely replace any pool-specific headers. 158s 158s :param retries: 158s Configure the number of retries to allow before raising a 158s :class:`~urllib3.exceptions.MaxRetryError` exception. 158s 158s Pass ``None`` to retry until you receive a response. Pass a 158s :class:`~urllib3.util.retry.Retry` object for fine-grained control 158s over different types of retries. 158s Pass an integer number to retry connection errors that many times, 158s but no other types of errors. Pass zero to never retry. 158s 158s If ``False``, then retries are disabled and any exception is raised 158s immediately. Also, instead of raising a MaxRetryError on redirects, 158s the redirect response will be returned. 158s 158s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 158s 158s :param timeout: 158s If specified, overrides the default timeout for this one 158s request. It may be a float (in seconds) or an instance of 158s :class:`urllib3.util.Timeout`. 158s 158s :param chunked: 158s If True, urllib3 will send the body using chunked transfer 158s encoding. Otherwise, urllib3 will send the body using the standard 158s content-length form. Defaults to False. 158s 158s :param response_conn: 158s Set this to ``None`` if you will handle releasing the connection or 158s set the connection to have the response release it. 158s 158s :param preload_content: 158s If True, the response's body will be preloaded during construction. 158s 158s :param decode_content: 158s If True, will attempt to decode the body based on the 158s 'content-encoding' header. 158s 158s :param enforce_content_length: 158s Enforce content length checking. Body returned by server must match 158s value of Content-Length header, if present. Otherwise, raise error. 158s """ 158s self.num_requests += 1 158s 158s timeout_obj = self._get_timeout(timeout) 158s timeout_obj.start_connect() 158s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 158s 158s try: 158s # Trigger any extra validation we need to do. 
158s try: 158s self._validate_conn(conn) 158s except (SocketTimeout, BaseSSLError) as e: 158s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 158s raise 158s 158s # _validate_conn() starts the connection to an HTTPS proxy 158s # so we need to wrap errors with 'ProxyError' here too. 158s except ( 158s OSError, 158s NewConnectionError, 158s TimeoutError, 158s BaseSSLError, 158s CertificateError, 158s SSLError, 158s ) as e: 158s new_e: Exception = e 158s if isinstance(e, (BaseSSLError, CertificateError)): 158s new_e = SSLError(e) 158s # If the connection didn't successfully connect to it's proxy 158s # then there 158s if isinstance( 158s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 158s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 158s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 158s raise new_e 158s 158s # conn.request() calls http.client.*.request, not the method in 158s # urllib3.request. It also calls makefile (recv) on the socket. 158s try: 158s conn.request( 158s method, 158s url, 158s body=body, 158s headers=headers, 158s chunked=chunked, 158s preload_content=preload_content, 158s decode_content=decode_content, 158s enforce_content_length=enforce_content_length, 158s ) 158s 158s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 158s # legitimately able to close the connection after sending a valid response. 158s # With this behaviour, the received response is still readable. 158s except BrokenPipeError: 158s pass 158s except OSError as e: 158s # MacOS/Linux 158s # EPROTOTYPE and ECONNRESET are needed on macOS 158s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 158s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 158s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 158s raise 158s 158s # Reset the timeout for the recv() on the socket 158s read_timeout = timeout_obj.read_timeout 158s 158s if not conn.is_closed: 158s # In Python 3 socket.py will catch EAGAIN and return None when you 158s # try and read into the file pointer created by http.client, which 158s # instead raises a BadStatusLine exception. Instead of catching 158s # the exception and assuming all BadStatusLine exceptions are read 158s # timeouts, check for a zero timeout before making the request. 158s if read_timeout == 0: 158s raise ReadTimeoutError( 158s self, url, f"Read timed out. (read timeout={read_timeout})" 158s ) 158s conn.timeout = read_timeout 158s 158s # Receive the response from the server 158s try: 158s response = conn.getresponse() 158s except (BaseSSLError, OSError) as e: 158s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 158s raise 158s 158s # Set properties that are used by the pooling layer. 
158s response.retries = retries 158s response._connection = response_conn # type: ignore[attr-defined] 158s response._pool = self # type: ignore[attr-defined] 158s 158s log.debug( 158s '%s://%s:%s "%s %s %s" %s %s', 158s self.scheme, 158s self.host, 158s self.port, 158s method, 158s url, 158s > response.version_string, 158s response.status, 158s response.length_remaining, 158s ) 158s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 158s 158s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 158s ----------------------------- Captured stderr call ----------------------------- 158s 127.0.0.1 - - [18/Jan/2025 02:32:08] "GET /gzip HTTP/1.1" 200 164 158s 127.0.0.1 - - [18/Jan/2025 02:32:08] "GET /gzip HTTP/1.1" 200 164 158s ___________________________ test_status_code[https] ____________________________ 158s 158s httpbin_both = 158s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-0/test_status_code_https_0') 158s verify_pool_mgr = 158s 158s def test_status_code(httpbin_both, tmpdir, verify_pool_mgr): 158s """Ensure that we can read the status code""" 158s url = httpbin_both.url 158s with vcr.use_cassette(str(tmpdir.join("atts.yaml"))): 158s > status_code = verify_pool_mgr.request("GET", url).status 158s 158s tests/integration/test_urllib3.py:34: 158s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 158s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 158s return self.request_encode_url( 158s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 158s return self.urlopen(method, url, **extra_kw) 158s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 158s response = conn.urlopen(method, u.request_uri, **kw) 158s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 158s response = self._make_request( 158s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 158s 158s self = 158s conn = 158s method = 'GET', url = '/', body = None, headers = {} 158s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 158s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 158s chunked = False, response_conn = None, preload_content = True 158s decode_content = True, enforce_content_length = True 158s 158s def _make_request( 158s self, 158s conn: BaseHTTPConnection, 158s method: str, 158s url: str, 158s body: _TYPE_BODY | None = None, 158s headers: typing.Mapping[str, str] | None = None, 158s retries: Retry | None = None, 158s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 158s chunked: bool = False, 158s response_conn: BaseHTTPConnection | None = None, 158s preload_content: bool = True, 158s decode_content: bool = True, 158s enforce_content_length: bool = True, 158s ) -> BaseHTTPResponse: 158s """ 158s Perform a request on a given urllib connection object taken from our 158s pool. 158s 158s :param conn: 158s a connection from one of our connection pools 158s 158s :param method: 158s HTTP request method (such as GET, POST, PUT, etc.) 158s 158s :param url: 158s The URL to perform the request on. 158s 158s :param body: 158s Data to send in the request body, either :class:`str`, :class:`bytes`, 158s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 158s 158s :param headers: 158s Dictionary of custom headers to send, such as User-Agent, 158s If-None-Match, etc. If None, pool headers are used. 
If provided, 158s these headers completely replace any pool-specific headers. 158s 158s :param retries: 158s Configure the number of retries to allow before raising a 158s :class:`~urllib3.exceptions.MaxRetryError` exception. 158s 158s Pass ``None`` to retry until you receive a response. Pass a 158s :class:`~urllib3.util.retry.Retry` object for fine-grained control 158s over different types of retries. 158s Pass an integer number to retry connection errors that many times, 158s but no other types of errors. Pass zero to never retry. 158s 158s If ``False``, then retries are disabled and any exception is raised 158s immediately. Also, instead of raising a MaxRetryError on redirects, 158s the redirect response will be returned. 158s 158s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 158s 158s :param timeout: 158s If specified, overrides the default timeout for this one 158s request. It may be a float (in seconds) or an instance of 158s :class:`urllib3.util.Timeout`. 158s 158s :param chunked: 158s If True, urllib3 will send the body using chunked transfer 158s encoding. Otherwise, urllib3 will send the body using the standard 158s content-length form. Defaults to False. 158s 158s :param response_conn: 158s Set this to ``None`` if you will handle releasing the connection or 158s set the connection to have the response release it. 158s 158s :param preload_content: 158s If True, the response's body will be preloaded during construction. 158s 158s :param decode_content: 158s If True, will attempt to decode the body based on the 158s 'content-encoding' header. 158s 158s :param enforce_content_length: 158s Enforce content length checking. Body returned by server must match 158s value of Content-Length header, if present. Otherwise, raise error. 158s """ 158s self.num_requests += 1 158s 158s timeout_obj = self._get_timeout(timeout) 158s timeout_obj.start_connect() 158s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 158s 158s try: 158s # Trigger any extra validation we need to do. 158s try: 158s self._validate_conn(conn) 158s except (SocketTimeout, BaseSSLError) as e: 158s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 158s raise 158s 158s # _validate_conn() starts the connection to an HTTPS proxy 158s # so we need to wrap errors with 'ProxyError' here too. 158s except ( 158s OSError, 158s NewConnectionError, 158s TimeoutError, 158s BaseSSLError, 158s CertificateError, 158s SSLError, 158s ) as e: 158s new_e: Exception = e 158s if isinstance(e, (BaseSSLError, CertificateError)): 158s new_e = SSLError(e) 158s # If the connection didn't successfully connect to it's proxy 158s # then there 158s if isinstance( 158s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 158s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 158s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 158s raise new_e 158s 158s # conn.request() calls http.client.*.request, not the method in 158s # urllib3.request. It also calls makefile (recv) on the socket. 158s try: 158s conn.request( 158s method, 158s url, 158s body=body, 158s headers=headers, 158s chunked=chunked, 158s preload_content=preload_content, 158s decode_content=decode_content, 158s enforce_content_length=enforce_content_length, 158s ) 158s 158s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 158s # legitimately able to close the connection after sending a valid response. 158s # With this behaviour, the received response is still readable. 
158s except BrokenPipeError: 158s pass 158s except OSError as e: 158s # MacOS/Linux 158s # EPROTOTYPE and ECONNRESET are needed on macOS 158s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 158s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 158s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 158s raise 158s 158s # Reset the timeout for the recv() on the socket 158s read_timeout = timeout_obj.read_timeout 158s 158s if not conn.is_closed: 158s # In Python 3 socket.py will catch EAGAIN and return None when you 158s # try and read into the file pointer created by http.client, which 158s # instead raises a BadStatusLine exception. Instead of catching 158s # the exception and assuming all BadStatusLine exceptions are read 158s # timeouts, check for a zero timeout before making the request. 158s if read_timeout == 0: 158s raise ReadTimeoutError( 158s self, url, f"Read timed out. (read timeout={read_timeout})" 158s ) 158s conn.timeout = read_timeout 158s 158s # Receive the response from the server 158s try: 158s response = conn.getresponse() 158s except (BaseSSLError, OSError) as e: 158s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 158s raise 158s 158s # Set properties that are used by the pooling layer. 158s response.retries = retries 158s response._connection = response_conn # type: ignore[attr-defined] 158s response._pool = self # type: ignore[attr-defined] 158s 158s log.debug( 158s '%s://%s:%s "%s %s %s" %s %s', 158s self.scheme, 158s self.host, 158s self.port, 158s method, 158s url, 158s > response.version_string, 158s response.status, 158s response.length_remaining, 158s ) 158s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 158s 158s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 158s ----------------------------- Captured stderr call ----------------------------- 158s 127.0.0.1 - - [18/Jan/2025 02:32:08] "GET / HTTP/1.1" 200 9358 158s _____________________________ test_headers[https] ______________________________ 158s 158s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-0/test_headers_https_0') 158s httpbin_both = 158s verify_pool_mgr = 158s 158s def test_headers(tmpdir, httpbin_both, verify_pool_mgr): 158s """Ensure that we can read the headers back""" 158s url = httpbin_both.url 158s with vcr.use_cassette(str(tmpdir.join("headers.yaml"))): 158s > headers = verify_pool_mgr.request("GET", url).headers 158s 158s tests/integration/test_urllib3.py:44: 158s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 158s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 158s return self.request_encode_url( 158s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 158s return self.urlopen(method, url, **extra_kw) 158s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 158s response = conn.urlopen(method, u.request_uri, **kw) 158s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 158s response = self._make_request( 158s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 158s 158s self = 158s conn = 158s method = 'GET', url = '/', body = None, headers = {} 158s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 158s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 158s chunked = False, response_conn = None, 
preload_content = True 158s decode_content = True, enforce_content_length = True 158s 158s def _make_request( 158s self, 158s conn: BaseHTTPConnection, 158s method: str, 158s url: str, 158s body: _TYPE_BODY | None = None, 158s headers: typing.Mapping[str, str] | None = None, 158s retries: Retry | None = None, 158s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 158s chunked: bool = False, 158s response_conn: BaseHTTPConnection | None = None, 158s preload_content: bool = True, 158s decode_content: bool = True, 158s enforce_content_length: bool = True, 158s ) -> BaseHTTPResponse: 158s """ 158s Perform a request on a given urllib connection object taken from our 158s pool. 158s 158s :param conn: 158s a connection from one of our connection pools 158s 158s :param method: 158s HTTP request method (such as GET, POST, PUT, etc.) 158s 158s :param url: 158s The URL to perform the request on. 158s 158s :param body: 158s Data to send in the request body, either :class:`str`, :class:`bytes`, 158s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 158s 158s :param headers: 158s Dictionary of custom headers to send, such as User-Agent, 158s If-None-Match, etc. If None, pool headers are used. If provided, 158s these headers completely replace any pool-specific headers. 158s 158s :param retries: 158s Configure the number of retries to allow before raising a 158s :class:`~urllib3.exceptions.MaxRetryError` exception. 158s 158s Pass ``None`` to retry until you receive a response. Pass a 158s :class:`~urllib3.util.retry.Retry` object for fine-grained control 158s over different types of retries. 158s Pass an integer number to retry connection errors that many times, 158s but no other types of errors. Pass zero to never retry. 158s 158s If ``False``, then retries are disabled and any exception is raised 158s immediately. Also, instead of raising a MaxRetryError on redirects, 158s the redirect response will be returned. 158s 158s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 158s 158s :param timeout: 158s If specified, overrides the default timeout for this one 158s request. It may be a float (in seconds) or an instance of 158s :class:`urllib3.util.Timeout`. 158s 158s :param chunked: 158s If True, urllib3 will send the body using chunked transfer 158s encoding. Otherwise, urllib3 will send the body using the standard 158s content-length form. Defaults to False. 158s 158s :param response_conn: 158s Set this to ``None`` if you will handle releasing the connection or 158s set the connection to have the response release it. 158s 158s :param preload_content: 158s If True, the response's body will be preloaded during construction. 158s 158s :param decode_content: 158s If True, will attempt to decode the body based on the 158s 'content-encoding' header. 158s 158s :param enforce_content_length: 158s Enforce content length checking. Body returned by server must match 158s value of Content-Length header, if present. Otherwise, raise error. 158s """ 158s self.num_requests += 1 158s 158s timeout_obj = self._get_timeout(timeout) 158s timeout_obj.start_connect() 158s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 158s 158s try: 158s # Trigger any extra validation we need to do. 
158s try: 158s self._validate_conn(conn) 158s except (SocketTimeout, BaseSSLError) as e: 158s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 158s raise 158s 158s # _validate_conn() starts the connection to an HTTPS proxy 158s # so we need to wrap errors with 'ProxyError' here too. 158s except ( 158s OSError, 158s NewConnectionError, 158s TimeoutError, 158s BaseSSLError, 158s CertificateError, 158s SSLError, 158s ) as e: 158s new_e: Exception = e 158s if isinstance(e, (BaseSSLError, CertificateError)): 158s new_e = SSLError(e) 158s # If the connection didn't successfully connect to it's proxy 158s # then there 158s if isinstance( 158s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 158s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 158s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 158s raise new_e 158s 158s # conn.request() calls http.client.*.request, not the method in 158s # urllib3.request. It also calls makefile (recv) on the socket. 158s try: 158s conn.request( 158s method, 158s url, 158s body=body, 158s headers=headers, 158s chunked=chunked, 158s preload_content=preload_content, 158s decode_content=decode_content, 158s enforce_content_length=enforce_content_length, 158s ) 158s 158s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 158s # legitimately able to close the connection after sending a valid response. 158s # With this behaviour, the received response is still readable. 158s except BrokenPipeError: 158s pass 158s except OSError as e: 158s # MacOS/Linux 158s # EPROTOTYPE and ECONNRESET are needed on macOS 158s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 158s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 158s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 158s raise 158s 158s # Reset the timeout for the recv() on the socket 158s read_timeout = timeout_obj.read_timeout 158s 158s if not conn.is_closed: 158s # In Python 3 socket.py will catch EAGAIN and return None when you 158s # try and read into the file pointer created by http.client, which 158s # instead raises a BadStatusLine exception. Instead of catching 158s # the exception and assuming all BadStatusLine exceptions are read 158s # timeouts, check for a zero timeout before making the request. 158s if read_timeout == 0: 158s raise ReadTimeoutError( 158s self, url, f"Read timed out. (read timeout={read_timeout})" 158s ) 158s conn.timeout = read_timeout 158s 158s # Receive the response from the server 158s try: 158s response = conn.getresponse() 158s except (BaseSSLError, OSError) as e: 158s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 158s raise 158s 158s # Set properties that are used by the pooling layer. 
158s response.retries = retries 158s response._connection = response_conn # type: ignore[attr-defined] 158s response._pool = self # type: ignore[attr-defined] 158s 158s log.debug( 158s '%s://%s:%s "%s %s %s" %s %s', 158s self.scheme, 158s self.host, 158s self.port, 158s method, 158s url, 158s > response.version_string, 158s response.status, 158s response.length_remaining, 158s ) 158s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 158s 158s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 158s ----------------------------- Captured stderr call ----------------------------- 158s 127.0.0.1 - - [18/Jan/2025 02:32:08] "GET / HTTP/1.1" 200 9358 158s _______________________________ test_body[https] _______________________________ 158s 158s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-0/test_body_https_0') 158s httpbin_both = 158s verify_pool_mgr = 158s 158s def test_body(tmpdir, httpbin_both, verify_pool_mgr): 158s """Ensure the responses are all identical enough""" 158s url = httpbin_both.url + "/bytes/1024" 158s with vcr.use_cassette(str(tmpdir.join("body.yaml"))): 158s > content = verify_pool_mgr.request("GET", url).data 158s 158s tests/integration/test_urllib3.py:55: 158s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 158s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 158s return self.request_encode_url( 158s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 158s return self.urlopen(method, url, **extra_kw) 158s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 158s response = conn.urlopen(method, u.request_uri, **kw) 158s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 158s response = self._make_request( 158s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 158s 158s self = 158s conn = 158s method = 'GET', url = '/bytes/1024', body = None, headers = {} 158s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 158s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 158s chunked = False, response_conn = None, preload_content = True 158s decode_content = True, enforce_content_length = True 158s 158s def _make_request( 158s self, 158s conn: BaseHTTPConnection, 158s method: str, 158s url: str, 158s body: _TYPE_BODY | None = None, 158s headers: typing.Mapping[str, str] | None = None, 158s retries: Retry | None = None, 158s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 158s chunked: bool = False, 158s response_conn: BaseHTTPConnection | None = None, 158s preload_content: bool = True, 158s decode_content: bool = True, 158s enforce_content_length: bool = True, 158s ) -> BaseHTTPResponse: 158s """ 158s Perform a request on a given urllib connection object taken from our 158s pool. 158s 158s :param conn: 158s a connection from one of our connection pools 158s 158s :param method: 158s HTTP request method (such as GET, POST, PUT, etc.) 158s 158s :param url: 158s The URL to perform the request on. 158s 158s :param body: 158s Data to send in the request body, either :class:`str`, :class:`bytes`, 158s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 158s 158s :param headers: 158s Dictionary of custom headers to send, such as User-Agent, 158s If-None-Match, etc. If None, pool headers are used. If provided, 158s these headers completely replace any pool-specific headers. 
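The :param headers: paragraph quoted above (per-request headers replace pool headers rather than merging with them) and the retries/timeout paragraphs that follow it can be exercised against a plain PoolManager outside the test harness. A minimal sketch, assuming a hypothetical local endpoint at http://127.0.0.1:8080/ that is not one of the fixtures in this run:

    import urllib3
    from urllib3.util.retry import Retry
    from urllib3.util.timeout import Timeout

    # Pool-level headers are sent with every request made through this manager.
    http = urllib3.PoolManager(headers={"User-Agent": "sketch/1.0"})

    # Supplying headers= here replaces the pool headers entirely (see the docstring
    # above); retries may be an int, False, None, or a Retry object, and timeout
    # may be a float or a Timeout with separate connect/read values.
    resp = http.request(
        "GET",
        "http://127.0.0.1:8080/get",                      # hypothetical endpoint
        headers={"Accept": "application/json"},
        retries=Retry(total=3, backoff_factor=0.1),
        timeout=Timeout(connect=2.0, read=5.0),
    )
    print(resp.status)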
158s 158s :param retries: 158s Configure the number of retries to allow before raising a 158s :class:`~urllib3.exceptions.MaxRetryError` exception. 158s 158s Pass ``None`` to retry until you receive a response. Pass a 158s :class:`~urllib3.util.retry.Retry` object for fine-grained control 158s over different types of retries. 158s Pass an integer number to retry connection errors that many times, 158s but no other types of errors. Pass zero to never retry. 158s 158s If ``False``, then retries are disabled and any exception is raised 158s immediately. Also, instead of raising a MaxRetryError on redirects, 158s the redirect response will be returned. 158s 158s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 158s 158s :param timeout: 158s If specified, overrides the default timeout for this one 158s request. It may be a float (in seconds) or an instance of 158s :class:`urllib3.util.Timeout`. 158s 158s :param chunked: 158s If True, urllib3 will send the body using chunked transfer 158s encoding. Otherwise, urllib3 will send the body using the standard 158s content-length form. Defaults to False. 158s 158s :param response_conn: 158s Set this to ``None`` if you will handle releasing the connection or 158s set the connection to have the response release it. 158s 158s :param preload_content: 158s If True, the response's body will be preloaded during construction. 158s 158s :param decode_content: 158s If True, will attempt to decode the body based on the 158s 'content-encoding' header. 158s 158s :param enforce_content_length: 158s Enforce content length checking. Body returned by server must match 158s value of Content-Length header, if present. Otherwise, raise error. 158s """ 158s self.num_requests += 1 158s 158s timeout_obj = self._get_timeout(timeout) 158s timeout_obj.start_connect() 158s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 158s 158s try: 158s # Trigger any extra validation we need to do. 158s try: 158s self._validate_conn(conn) 158s except (SocketTimeout, BaseSSLError) as e: 158s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 158s raise 158s 158s # _validate_conn() starts the connection to an HTTPS proxy 158s # so we need to wrap errors with 'ProxyError' here too. 158s except ( 158s OSError, 158s NewConnectionError, 158s TimeoutError, 158s BaseSSLError, 158s CertificateError, 158s SSLError, 158s ) as e: 158s new_e: Exception = e 158s if isinstance(e, (BaseSSLError, CertificateError)): 158s new_e = SSLError(e) 158s # If the connection didn't successfully connect to it's proxy 158s # then there 158s if isinstance( 158s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 158s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 158s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 158s raise new_e 158s 158s # conn.request() calls http.client.*.request, not the method in 158s # urllib3.request. It also calls makefile (recv) on the socket. 158s try: 158s conn.request( 158s method, 158s url, 158s body=body, 158s headers=headers, 158s chunked=chunked, 158s preload_content=preload_content, 158s decode_content=decode_content, 158s enforce_content_length=enforce_content_length, 158s ) 158s 158s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 158s # legitimately able to close the connection after sending a valid response. 158s # With this behaviour, the received response is still readable. 
158s except BrokenPipeError: 158s pass 158s except OSError as e: 158s # MacOS/Linux 158s # EPROTOTYPE and ECONNRESET are needed on macOS 158s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 158s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 158s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 158s raise 158s 158s # Reset the timeout for the recv() on the socket 158s read_timeout = timeout_obj.read_timeout 158s 158s if not conn.is_closed: 158s # In Python 3 socket.py will catch EAGAIN and return None when you 158s # try and read into the file pointer created by http.client, which 158s # instead raises a BadStatusLine exception. Instead of catching 158s # the exception and assuming all BadStatusLine exceptions are read 158s # timeouts, check for a zero timeout before making the request. 158s if read_timeout == 0: 158s raise ReadTimeoutError( 158s self, url, f"Read timed out. (read timeout={read_timeout})" 158s ) 158s conn.timeout = read_timeout 158s 158s # Receive the response from the server 158s try: 158s response = conn.getresponse() 158s except (BaseSSLError, OSError) as e: 158s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 158s raise 158s 158s # Set properties that are used by the pooling layer. 158s response.retries = retries 158s response._connection = response_conn # type: ignore[attr-defined] 158s response._pool = self # type: ignore[attr-defined] 158s 158s log.debug( 158s '%s://%s:%s "%s %s %s" %s %s', 158s self.scheme, 158s self.host, 158s self.port, 158s method, 158s url, 158s > response.version_string, 158s response.status, 158s response.length_remaining, 158s ) 158s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 158s 158s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 158s ----------------------------- Captured stderr call ----------------------------- 158s 127.0.0.1 - - [18/Jan/2025 02:32:08] "GET /bytes/1024 HTTP/1.1" 200 1024 158s _______________________________ test_auth[https] _______________________________ 158s 158s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-0/test_auth_https_0') 158s httpbin_both = 158s verify_pool_mgr = 158s 158s def test_auth(tmpdir, httpbin_both, verify_pool_mgr): 158s """Ensure that we can handle basic auth""" 158s auth = ("user", "passwd") 158s headers = urllib3.util.make_headers(basic_auth="{}:{}".format(*auth)) 158s url = httpbin_both.url + "/basic-auth/user/passwd" 158s with vcr.use_cassette(str(tmpdir.join("auth.yaml"))): 158s > one = verify_pool_mgr.request("GET", url, headers=headers) 158s 158s tests/integration/test_urllib3.py:67: 158s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 158s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 158s return self.request_encode_url( 158s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 158s return self.urlopen(method, url, **extra_kw) 158s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 158s response = conn.urlopen(method, u.request_uri, **kw) 158s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 158s response = self._make_request( 158s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 158s 158s self = 158s conn = 158s method = 'GET', url = '/basic-auth/user/passwd', body = None 158s headers = {'authorization': 'Basic dXNlcjpwYXNzd2Q='} 158s retries = Retry(total=3, 
connect=None, read=None, redirect=None, status=None) 158s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 158s chunked = False, response_conn = None, preload_content = True 158s decode_content = True, enforce_content_length = True 158s 158s def _make_request( 158s self, 158s conn: BaseHTTPConnection, 158s method: str, 158s url: str, 158s body: _TYPE_BODY | None = None, 158s headers: typing.Mapping[str, str] | None = None, 158s retries: Retry | None = None, 158s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 158s chunked: bool = False, 158s response_conn: BaseHTTPConnection | None = None, 158s preload_content: bool = True, 158s decode_content: bool = True, 158s enforce_content_length: bool = True, 158s ) -> BaseHTTPResponse: 158s """ 158s Perform a request on a given urllib connection object taken from our 158s pool. 158s 158s :param conn: 158s a connection from one of our connection pools 158s 158s :param method: 158s HTTP request method (such as GET, POST, PUT, etc.) 158s 158s :param url: 158s The URL to perform the request on. 158s 158s :param body: 158s Data to send in the request body, either :class:`str`, :class:`bytes`, 158s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 158s 158s :param headers: 158s Dictionary of custom headers to send, such as User-Agent, 158s If-None-Match, etc. If None, pool headers are used. If provided, 158s these headers completely replace any pool-specific headers. 158s 158s :param retries: 158s Configure the number of retries to allow before raising a 158s :class:`~urllib3.exceptions.MaxRetryError` exception. 158s 158s Pass ``None`` to retry until you receive a response. Pass a 158s :class:`~urllib3.util.retry.Retry` object for fine-grained control 158s over different types of retries. 158s Pass an integer number to retry connection errors that many times, 158s but no other types of errors. Pass zero to never retry. 158s 158s If ``False``, then retries are disabled and any exception is raised 158s immediately. Also, instead of raising a MaxRetryError on redirects, 158s the redirect response will be returned. 158s 158s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 158s 158s :param timeout: 158s If specified, overrides the default timeout for this one 158s request. It may be a float (in seconds) or an instance of 158s :class:`urllib3.util.Timeout`. 158s 158s :param chunked: 158s If True, urllib3 will send the body using chunked transfer 158s encoding. Otherwise, urllib3 will send the body using the standard 158s content-length form. Defaults to False. 158s 158s :param response_conn: 158s Set this to ``None`` if you will handle releasing the connection or 158s set the connection to have the response release it. 158s 158s :param preload_content: 158s If True, the response's body will be preloaded during construction. 158s 158s :param decode_content: 158s If True, will attempt to decode the body based on the 158s 'content-encoding' header. 158s 158s :param enforce_content_length: 158s Enforce content length checking. Body returned by server must match 158s value of Content-Length header, if present. Otherwise, raise error. 158s """ 158s self.num_requests += 1 158s 158s timeout_obj = self._get_timeout(timeout) 158s timeout_obj.start_connect() 158s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 158s 158s try: 158s # Trigger any extra validation we need to do. 
158s try: 158s self._validate_conn(conn) 158s except (SocketTimeout, BaseSSLError) as e: 158s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 158s raise 158s 158s # _validate_conn() starts the connection to an HTTPS proxy 158s # so we need to wrap errors with 'ProxyError' here too. 158s except ( 158s OSError, 158s NewConnectionError, 158s TimeoutError, 158s BaseSSLError, 158s CertificateError, 158s SSLError, 158s ) as e: 158s new_e: Exception = e 158s if isinstance(e, (BaseSSLError, CertificateError)): 158s new_e = SSLError(e) 158s # If the connection didn't successfully connect to it's proxy 158s # then there 158s if isinstance( 158s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 158s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 158s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 158s raise new_e 158s 158s # conn.request() calls http.client.*.request, not the method in 158s # urllib3.request. It also calls makefile (recv) on the socket. 158s try: 158s conn.request( 158s method, 158s url, 158s body=body, 158s headers=headers, 158s chunked=chunked, 158s preload_content=preload_content, 158s decode_content=decode_content, 158s enforce_content_length=enforce_content_length, 158s ) 158s 158s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 158s # legitimately able to close the connection after sending a valid response. 158s # With this behaviour, the received response is still readable. 158s except BrokenPipeError: 158s pass 158s except OSError as e: 158s # MacOS/Linux 158s # EPROTOTYPE and ECONNRESET are needed on macOS 158s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 158s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 158s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 158s raise 158s 158s # Reset the timeout for the recv() on the socket 158s read_timeout = timeout_obj.read_timeout 158s 158s if not conn.is_closed: 158s # In Python 3 socket.py will catch EAGAIN and return None when you 158s # try and read into the file pointer created by http.client, which 158s # instead raises a BadStatusLine exception. Instead of catching 158s # the exception and assuming all BadStatusLine exceptions are read 158s # timeouts, check for a zero timeout before making the request. 158s if read_timeout == 0: 158s raise ReadTimeoutError( 158s self, url, f"Read timed out. (read timeout={read_timeout})" 158s ) 158s conn.timeout = read_timeout 158s 158s # Receive the response from the server 158s try: 158s response = conn.getresponse() 158s except (BaseSSLError, OSError) as e: 158s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 158s raise 158s 158s # Set properties that are used by the pooling layer. 
158s response.retries = retries 158s response._connection = response_conn # type: ignore[attr-defined] 158s response._pool = self # type: ignore[attr-defined] 158s 158s log.debug( 158s '%s://%s:%s "%s %s %s" %s %s', 158s self.scheme, 158s self.host, 158s self.port, 158s method, 158s url, 158s > response.version_string, 158s response.status, 158s response.length_remaining, 158s ) 158s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 158s 158s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 158s ----------------------------- Captured stderr call ----------------------------- 158s 127.0.0.1 - - [18/Jan/2025 02:32:08] "GET /basic-auth/user/passwd HTTP/1.1" 200 46 158s ___________________________ test_auth_failed[https] ____________________________ 158s 158s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-0/test_auth_failed_https_0') 158s httpbin_both = 158s verify_pool_mgr = 158s 158s def test_auth_failed(tmpdir, httpbin_both, verify_pool_mgr): 158s """Ensure that we can save failed auth statuses""" 158s auth = ("user", "wrongwrongwrong") 158s headers = urllib3.util.make_headers(basic_auth="{}:{}".format(*auth)) 158s url = httpbin_both.url + "/basic-auth/user/passwd" 158s with vcr.use_cassette(str(tmpdir.join("auth-failed.yaml"))) as cass: 158s # Ensure that this is empty to begin with 158s assert_cassette_empty(cass) 158s > one = verify_pool_mgr.request("GET", url, headers=headers) 158s 158s tests/integration/test_urllib3.py:83: 158s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 158s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 158s return self.request_encode_url( 158s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 158s return self.urlopen(method, url, **extra_kw) 158s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 158s response = conn.urlopen(method, u.request_uri, **kw) 158s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 158s response = self._make_request( 158s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 158s 158s self = 158s conn = 158s method = 'GET', url = '/basic-auth/user/passwd', body = None 158s headers = {'authorization': 'Basic dXNlcjp3cm9uZ3dyb25nd3Jvbmc='} 158s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 158s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 158s chunked = False, response_conn = None, preload_content = True 158s decode_content = True, enforce_content_length = True 158s 158s def _make_request( 158s self, 158s conn: BaseHTTPConnection, 158s method: str, 158s url: str, 158s body: _TYPE_BODY | None = None, 158s headers: typing.Mapping[str, str] | None = None, 158s retries: Retry | None = None, 158s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 158s chunked: bool = False, 158s response_conn: BaseHTTPConnection | None = None, 158s preload_content: bool = True, 158s decode_content: bool = True, 158s enforce_content_length: bool = True, 158s ) -> BaseHTTPResponse: 158s """ 158s Perform a request on a given urllib connection object taken from our 158s pool. 158s 158s :param conn: 158s a connection from one of our connection pools 158s 158s :param method: 158s HTTP request method (such as GET, POST, PUT, etc.) 158s 158s :param url: 158s The URL to perform the request on. 
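test_auth_failed above builds its Authorization header with urllib3.util.make_headers and records the expected 401 through a cassette. A minimal standalone sketch of that pattern; the credentials, cassette path, and URL are illustrative stand-ins rather than values taken from the fixtures:

    import urllib3
    import vcr

    headers = urllib3.util.make_headers(basic_auth="user:wrongwrongwrong")
    pool = urllib3.PoolManager()

    # The first run records the interaction into the cassette; later runs replay
    # it from disk instead of hitting the server again.
    with vcr.use_cassette("/tmp/auth-failed-sketch.yaml"):
        resp = pool.request(
            "GET",
            "http://127.0.0.1:8080/basic-auth/user/passwd",   # hypothetical endpoint
            headers=headers,
        )
        print(resp.status)   # an httpbin-style server would answer 401 here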
158s 158s :param body: 158s Data to send in the request body, either :class:`str`, :class:`bytes`, 158s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 158s 158s :param headers: 158s Dictionary of custom headers to send, such as User-Agent, 158s If-None-Match, etc. If None, pool headers are used. If provided, 158s these headers completely replace any pool-specific headers. 158s 158s :param retries: 158s Configure the number of retries to allow before raising a 158s :class:`~urllib3.exceptions.MaxRetryError` exception. 158s 158s Pass ``None`` to retry until you receive a response. Pass a 158s :class:`~urllib3.util.retry.Retry` object for fine-grained control 158s over different types of retries. 158s Pass an integer number to retry connection errors that many times, 158s but no other types of errors. Pass zero to never retry. 158s 158s If ``False``, then retries are disabled and any exception is raised 158s immediately. Also, instead of raising a MaxRetryError on redirects, 158s the redirect response will be returned. 158s 158s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 158s 158s :param timeout: 158s If specified, overrides the default timeout for this one 158s request. It may be a float (in seconds) or an instance of 158s :class:`urllib3.util.Timeout`. 158s 158s :param chunked: 158s If True, urllib3 will send the body using chunked transfer 158s encoding. Otherwise, urllib3 will send the body using the standard 158s content-length form. Defaults to False. 158s 158s :param response_conn: 158s Set this to ``None`` if you will handle releasing the connection or 158s set the connection to have the response release it. 158s 158s :param preload_content: 158s If True, the response's body will be preloaded during construction. 158s 158s :param decode_content: 158s If True, will attempt to decode the body based on the 158s 'content-encoding' header. 158s 158s :param enforce_content_length: 158s Enforce content length checking. Body returned by server must match 158s value of Content-Length header, if present. Otherwise, raise error. 158s """ 158s self.num_requests += 1 158s 158s timeout_obj = self._get_timeout(timeout) 158s timeout_obj.start_connect() 158s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 158s 158s try: 158s # Trigger any extra validation we need to do. 158s try: 158s self._validate_conn(conn) 158s except (SocketTimeout, BaseSSLError) as e: 158s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 158s raise 158s 158s # _validate_conn() starts the connection to an HTTPS proxy 158s # so we need to wrap errors with 'ProxyError' here too. 158s except ( 158s OSError, 158s NewConnectionError, 158s TimeoutError, 158s BaseSSLError, 158s CertificateError, 158s SSLError, 158s ) as e: 158s new_e: Exception = e 158s if isinstance(e, (BaseSSLError, CertificateError)): 158s new_e = SSLError(e) 158s # If the connection didn't successfully connect to it's proxy 158s # then there 158s if isinstance( 158s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 158s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 158s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 158s raise new_e 158s 158s # conn.request() calls http.client.*.request, not the method in 158s # urllib3.request. It also calls makefile (recv) on the socket. 
158s try: 158s conn.request( 158s method, 158s url, 158s body=body, 158s headers=headers, 158s chunked=chunked, 158s preload_content=preload_content, 158s decode_content=decode_content, 158s enforce_content_length=enforce_content_length, 158s ) 158s 158s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 158s # legitimately able to close the connection after sending a valid response. 158s # With this behaviour, the received response is still readable. 158s except BrokenPipeError: 158s pass 158s except OSError as e: 158s # MacOS/Linux 158s # EPROTOTYPE and ECONNRESET are needed on macOS 158s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 158s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 158s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 158s raise 158s 158s # Reset the timeout for the recv() on the socket 158s read_timeout = timeout_obj.read_timeout 158s 158s if not conn.is_closed: 158s # In Python 3 socket.py will catch EAGAIN and return None when you 158s # try and read into the file pointer created by http.client, which 158s # instead raises a BadStatusLine exception. Instead of catching 158s # the exception and assuming all BadStatusLine exceptions are read 158s # timeouts, check for a zero timeout before making the request. 158s if read_timeout == 0: 158s raise ReadTimeoutError( 158s self, url, f"Read timed out. (read timeout={read_timeout})" 158s ) 158s conn.timeout = read_timeout 158s 158s # Receive the response from the server 158s try: 158s response = conn.getresponse() 158s except (BaseSSLError, OSError) as e: 158s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 158s raise 158s 158s # Set properties that are used by the pooling layer. 
158s response.retries = retries 158s response._connection = response_conn # type: ignore[attr-defined] 158s response._pool = self # type: ignore[attr-defined] 158s 158s log.debug( 158s '%s://%s:%s "%s %s %s" %s %s', 158s self.scheme, 158s self.host, 158s self.port, 158s method, 158s url, 158s > response.version_string, 158s response.status, 158s response.length_remaining, 158s ) 158s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 158s 158s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 158s ----------------------------- Captured stderr call ----------------------------- 158s 127.0.0.1 - - [18/Jan/2025 02:32:08] "GET /basic-auth/user/passwd HTTP/1.1" 401 0 158s _______________________________ test_post[https] _______________________________ 158s 158s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-0/test_post_https_0') 158s httpbin_both = 158s verify_pool_mgr = 158s 158s def test_post(tmpdir, httpbin_both, verify_pool_mgr): 158s """Ensure that we can post and cache the results""" 158s data = {"key1": "value1", "key2": "value2"} 158s url = httpbin_both.url + "/post" 158s with vcr.use_cassette(str(tmpdir.join("verify_pool_mgr.yaml"))): 158s > req1 = verify_pool_mgr.request("POST", url, data).data 158s 158s tests/integration/test_urllib3.py:94: 158s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 158s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:143: in request 158s return self.request_encode_body( 158s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:278: in request_encode_body 158s return self.urlopen(method, url, **extra_kw) 158s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 158s response = conn.urlopen(method, u.request_uri, **kw) 158s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 158s response = self._make_request( 158s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 158s 158s self = 158s conn = 158s method = 'POST', url = '/post', body = {'key1': 'value1', 'key2': 'value2'} 158s headers = HTTPHeaderDict({}) 158s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 158s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 158s chunked = False, response_conn = None, preload_content = True 158s decode_content = True, enforce_content_length = True 158s 158s def _make_request( 158s self, 158s conn: BaseHTTPConnection, 158s method: str, 158s url: str, 158s body: _TYPE_BODY | None = None, 158s headers: typing.Mapping[str, str] | None = None, 158s retries: Retry | None = None, 158s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 158s chunked: bool = False, 158s response_conn: BaseHTTPConnection | None = None, 158s preload_content: bool = True, 158s decode_content: bool = True, 158s enforce_content_length: bool = True, 158s ) -> BaseHTTPResponse: 158s """ 158s Perform a request on a given urllib connection object taken from our 158s pool. 158s 158s :param conn: 158s a connection from one of our connection pools 158s 158s :param method: 158s HTTP request method (such as GET, POST, PUT, etc.) 158s 158s :param url: 158s The URL to perform the request on. 158s 158s :param body: 158s Data to send in the request body, either :class:`str`, :class:`bytes`, 158s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 158s 158s :param headers: 158s Dictionary of custom headers to send, such as User-Agent, 158s If-None-Match, etc. 
If None, pool headers are used. If provided, 158s these headers completely replace any pool-specific headers. 158s 158s :param retries: 158s Configure the number of retries to allow before raising a 158s :class:`~urllib3.exceptions.MaxRetryError` exception. 158s 158s Pass ``None`` to retry until you receive a response. Pass a 158s :class:`~urllib3.util.retry.Retry` object for fine-grained control 158s over different types of retries. 158s Pass an integer number to retry connection errors that many times, 158s but no other types of errors. Pass zero to never retry. 158s 158s If ``False``, then retries are disabled and any exception is raised 158s immediately. Also, instead of raising a MaxRetryError on redirects, 158s the redirect response will be returned. 158s 158s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 158s 158s :param timeout: 158s If specified, overrides the default timeout for this one 158s request. It may be a float (in seconds) or an instance of 158s :class:`urllib3.util.Timeout`. 158s 158s :param chunked: 158s If True, urllib3 will send the body using chunked transfer 158s encoding. Otherwise, urllib3 will send the body using the standard 158s content-length form. Defaults to False. 158s 158s :param response_conn: 158s Set this to ``None`` if you will handle releasing the connection or 158s set the connection to have the response release it. 158s 158s :param preload_content: 158s If True, the response's body will be preloaded during construction. 158s 158s :param decode_content: 158s If True, will attempt to decode the body based on the 158s 'content-encoding' header. 158s 158s :param enforce_content_length: 158s Enforce content length checking. Body returned by server must match 158s value of Content-Length header, if present. Otherwise, raise error. 158s """ 158s self.num_requests += 1 158s 158s timeout_obj = self._get_timeout(timeout) 158s timeout_obj.start_connect() 158s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 158s 158s try: 158s # Trigger any extra validation we need to do. 158s try: 158s self._validate_conn(conn) 158s except (SocketTimeout, BaseSSLError) as e: 158s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 158s raise 158s 158s # _validate_conn() starts the connection to an HTTPS proxy 158s # so we need to wrap errors with 'ProxyError' here too. 158s except ( 158s OSError, 158s NewConnectionError, 158s TimeoutError, 158s BaseSSLError, 158s CertificateError, 158s SSLError, 158s ) as e: 158s new_e: Exception = e 158s if isinstance(e, (BaseSSLError, CertificateError)): 158s new_e = SSLError(e) 158s # If the connection didn't successfully connect to it's proxy 158s # then there 158s if isinstance( 158s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 158s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 158s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 158s raise new_e 158s 158s # conn.request() calls http.client.*.request, not the method in 158s # urllib3.request. It also calls makefile (recv) on the socket. 158s try: 158s conn.request( 158s method, 158s url, 158s body=body, 158s headers=headers, 158s chunked=chunked, 158s preload_content=preload_content, 158s decode_content=decode_content, 158s enforce_content_length=enforce_content_length, 158s ) 158s 158s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 158s # legitimately able to close the connection after sending a valid response. 
158s # With this behaviour, the received response is still readable. 158s except BrokenPipeError: 158s pass 158s except OSError as e: 158s # MacOS/Linux 158s # EPROTOTYPE and ECONNRESET are needed on macOS 158s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 158s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 158s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 158s raise 158s 158s # Reset the timeout for the recv() on the socket 158s read_timeout = timeout_obj.read_timeout 158s 158s if not conn.is_closed: 158s # In Python 3 socket.py will catch EAGAIN and return None when you 158s # try and read into the file pointer created by http.client, which 158s # instead raises a BadStatusLine exception. Instead of catching 158s # the exception and assuming all BadStatusLine exceptions are read 158s # timeouts, check for a zero timeout before making the request. 158s if read_timeout == 0: 158s raise ReadTimeoutError( 158s self, url, f"Read timed out. (read timeout={read_timeout})" 158s ) 158s conn.timeout = read_timeout 158s 158s # Receive the response from the server 158s try: 158s response = conn.getresponse() 158s except (BaseSSLError, OSError) as e: 158s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 158s raise 158s 158s # Set properties that are used by the pooling layer. 158s response.retries = retries 158s response._connection = response_conn # type: ignore[attr-defined] 158s response._pool = self # type: ignore[attr-defined] 158s 158s log.debug( 158s '%s://%s:%s "%s %s %s" %s %s', 158s self.scheme, 158s self.host, 158s self.port, 158s method, 158s url, 158s > response.version_string, 158s response.status, 158s response.length_remaining, 158s ) 158s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 158s 158s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 158s ----------------------------- Captured stderr call ----------------------------- 158s 127.0.0.1 - - [18/Jan/2025 02:32:08] "POST /post HTTP/1.1" 501 159 158s _______________________________ test_gzip[https] _______________________________ 158s 158s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-0/test_gzip_https_0') 158s httpbin_both = 158s verify_pool_mgr = 158s 158s def test_gzip(tmpdir, httpbin_both, verify_pool_mgr): 158s """ 158s Ensure that requests (actually urllib3) is able to automatically decompress 158s the response body 158s """ 158s url = httpbin_both.url + "/gzip" 158s response = verify_pool_mgr.request("GET", url) 158s 158s with vcr.use_cassette(str(tmpdir.join("gzip.yaml"))): 158s > response = verify_pool_mgr.request("GET", url) 158s 158s tests/integration/test_urllib3.py:140: 158s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 158s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 158s return self.request_encode_url( 158s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 158s return self.urlopen(method, url, **extra_kw) 158s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 158s response = conn.urlopen(method, u.request_uri, **kw) 158s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 158s response = self._make_request( 158s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 158s 158s self = 158s conn = 158s method = 'GET', url = '/gzip', body = None, headers = {} 158s retries = Retry(total=3, 
connect=None, read=None, redirect=None, status=None) 158s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 158s chunked = False, response_conn = None, preload_content = True 158s decode_content = True, enforce_content_length = True 158s 158s def _make_request( 158s self, 158s conn: BaseHTTPConnection, 158s method: str, 158s url: str, 158s body: _TYPE_BODY | None = None, 158s headers: typing.Mapping[str, str] | None = None, 158s retries: Retry | None = None, 158s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 158s chunked: bool = False, 158s response_conn: BaseHTTPConnection | None = None, 158s preload_content: bool = True, 158s decode_content: bool = True, 158s enforce_content_length: bool = True, 158s ) -> BaseHTTPResponse: 158s """ 158s Perform a request on a given urllib connection object taken from our 158s pool. 158s 158s :param conn: 158s a connection from one of our connection pools 158s 158s :param method: 158s HTTP request method (such as GET, POST, PUT, etc.) 158s 158s :param url: 158s The URL to perform the request on. 158s 158s :param body: 158s Data to send in the request body, either :class:`str`, :class:`bytes`, 158s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 158s 158s :param headers: 158s Dictionary of custom headers to send, such as User-Agent, 158s If-None-Match, etc. If None, pool headers are used. If provided, 158s these headers completely replace any pool-specific headers. 158s 158s :param retries: 158s Configure the number of retries to allow before raising a 158s :class:`~urllib3.exceptions.MaxRetryError` exception. 158s 158s Pass ``None`` to retry until you receive a response. Pass a 158s :class:`~urllib3.util.retry.Retry` object for fine-grained control 158s over different types of retries. 158s Pass an integer number to retry connection errors that many times, 158s but no other types of errors. Pass zero to never retry. 158s 158s If ``False``, then retries are disabled and any exception is raised 158s immediately. Also, instead of raising a MaxRetryError on redirects, 158s the redirect response will be returned. 158s 158s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 158s 158s :param timeout: 158s If specified, overrides the default timeout for this one 158s request. It may be a float (in seconds) or an instance of 158s :class:`urllib3.util.Timeout`. 158s 158s :param chunked: 158s If True, urllib3 will send the body using chunked transfer 158s encoding. Otherwise, urllib3 will send the body using the standard 158s content-length form. Defaults to False. 158s 158s :param response_conn: 158s Set this to ``None`` if you will handle releasing the connection or 158s set the connection to have the response release it. 158s 158s :param preload_content: 158s If True, the response's body will be preloaded during construction. 158s 158s :param decode_content: 158s If True, will attempt to decode the body based on the 158s 'content-encoding' header. 158s 158s :param enforce_content_length: 158s Enforce content length checking. Body returned by server must match 158s value of Content-Length header, if present. Otherwise, raise error. 158s """ 158s self.num_requests += 1 158s 158s timeout_obj = self._get_timeout(timeout) 158s timeout_obj.start_connect() 158s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 158s 158s try: 158s # Trigger any extra validation we need to do. 
158s try: 158s self._validate_conn(conn) 158s except (SocketTimeout, BaseSSLError) as e: 158s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 158s raise 158s 158s # _validate_conn() starts the connection to an HTTPS proxy 158s # so we need to wrap errors with 'ProxyError' here too. 158s except ( 158s OSError, 158s NewConnectionError, 158s TimeoutError, 158s BaseSSLError, 158s CertificateError, 158s SSLError, 158s ) as e: 158s new_e: Exception = e 158s if isinstance(e, (BaseSSLError, CertificateError)): 158s new_e = SSLError(e) 158s # If the connection didn't successfully connect to it's proxy 158s # then there 158s if isinstance( 158s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 158s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 158s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 158s raise new_e 158s 158s # conn.request() calls http.client.*.request, not the method in 158s # urllib3.request. It also calls makefile (recv) on the socket. 158s try: 158s conn.request( 158s method, 158s url, 158s body=body, 158s headers=headers, 158s chunked=chunked, 158s preload_content=preload_content, 158s decode_content=decode_content, 158s enforce_content_length=enforce_content_length, 158s ) 158s 158s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 158s # legitimately able to close the connection after sending a valid response. 158s # With this behaviour, the received response is still readable. 158s except BrokenPipeError: 158s pass 158s except OSError as e: 158s # MacOS/Linux 158s # EPROTOTYPE and ECONNRESET are needed on macOS 158s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 158s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 158s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 158s raise 158s 158s # Reset the timeout for the recv() on the socket 158s read_timeout = timeout_obj.read_timeout 158s 158s if not conn.is_closed: 158s # In Python 3 socket.py will catch EAGAIN and return None when you 158s # try and read into the file pointer created by http.client, which 158s # instead raises a BadStatusLine exception. Instead of catching 158s # the exception and assuming all BadStatusLine exceptions are read 158s # timeouts, check for a zero timeout before making the request. 158s if read_timeout == 0: 158s raise ReadTimeoutError( 158s self, url, f"Read timed out. (read timeout={read_timeout})" 158s ) 158s conn.timeout = read_timeout 158s 158s # Receive the response from the server 158s try: 158s response = conn.getresponse() 158s except (BaseSSLError, OSError) as e: 158s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 158s raise 158s 158s # Set properties that are used by the pooling layer. 
158s response.retries = retries 158s response._connection = response_conn # type: ignore[attr-defined] 158s response._pool = self # type: ignore[attr-defined] 158s 158s log.debug( 158s '%s://%s:%s "%s %s %s" %s %s', 158s self.scheme, 158s self.host, 158s self.port, 158s method, 158s url, 158s > response.version_string, 158s response.status, 158s response.length_remaining, 158s ) 158s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 158s 158s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 158s ----------------------------- Captured stderr call ----------------------------- 158s 127.0.0.1 - - [18/Jan/2025 02:32:08] "GET /gzip HTTP/1.1" 200 165 158s 127.0.0.1 - - [18/Jan/2025 02:32:08] "GET /gzip HTTP/1.1" 200 165 158s ________________________________ test_use_proxy ________________________________ 158s 158s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-0/test_use_proxy0') 158s httpbin = 158s proxy_server = 'http://0.0.0.0:56633' 158s 158s def test_use_proxy(tmpdir, httpbin, proxy_server): 158s """Ensure that it works with a proxy.""" 158s with vcr.use_cassette(str(tmpdir.join("proxy.yaml"))): 158s > response = requests.get(httpbin.url, proxies={"http": proxy_server}) 158s 158s tests/integration/test_proxy.py:53: 158s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 158s /usr/lib/python3/dist-packages/requests/api.py:73: in get 158s return request("get", url, params=params, **kwargs) 158s /usr/lib/python3/dist-packages/requests/api.py:59: in request 158s return session.request(method=method, url=url, **kwargs) 158s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 158s resp = self.send(prep, **send_kwargs) 158s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 158s r = adapter.send(request, **kwargs) 158s /usr/lib/python3/dist-packages/requests/adapters.py:667: in send 158s resp = conn.urlopen( 158s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 158s response = self._make_request( 158s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 158s 158s self = 158s conn = 158s method = 'GET', url = 'http://127.0.0.1:35353/', body = None 158s headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate, br', 'Accept': '*/*', 'Connection': 'keep-alive'} 158s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 158s timeout = Timeout(connect=None, read=None, total=None), chunked = False 158s response_conn = 158s preload_content = False, decode_content = False, enforce_content_length = True 158s 158s def _make_request( 158s self, 158s conn: BaseHTTPConnection, 158s method: str, 158s url: str, 158s body: _TYPE_BODY | None = None, 158s headers: typing.Mapping[str, str] | None = None, 158s retries: Retry | None = None, 158s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 158s chunked: bool = False, 158s response_conn: BaseHTTPConnection | None = None, 158s preload_content: bool = True, 158s decode_content: bool = True, 158s enforce_content_length: bool = True, 158s ) -> BaseHTTPResponse: 158s """ 158s Perform a request on a given urllib connection object taken from our 158s pool. 158s 158s :param conn: 158s a connection from one of our connection pools 158s 158s :param method: 158s HTTP request method (such as GET, POST, PUT, etc.) 158s 158s :param url: 158s The URL to perform the request on. 
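test_use_proxy above goes through requests rather than a bare PoolManager, pointing requests at the proxy fixture before the call reaches the _make_request shown in this traceback. A minimal sketch of the same call shape, with a hypothetical proxy address and target URL standing in for the fixtures:

    import requests
    import vcr

    proxies = {"http": "http://127.0.0.1:56633"}          # hypothetical proxy address

    with vcr.use_cassette("/tmp/proxy-sketch.yaml"):
        # requests hands this off to urllib3, which is where the recorded
        # response object is substituted for the real one.
        resp = requests.get("http://127.0.0.1:35353/", proxies=proxies, timeout=5)
        print(resp.status_code)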
158s 158s :param body: 158s Data to send in the request body, either :class:`str`, :class:`bytes`, 158s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 158s 158s :param headers: 158s Dictionary of custom headers to send, such as User-Agent, 158s If-None-Match, etc. If None, pool headers are used. If provided, 158s these headers completely replace any pool-specific headers. 158s 158s :param retries: 158s Configure the number of retries to allow before raising a 158s :class:`~urllib3.exceptions.MaxRetryError` exception. 158s 158s Pass ``None`` to retry until you receive a response. Pass a 158s :class:`~urllib3.util.retry.Retry` object for fine-grained control 158s over different types of retries. 158s Pass an integer number to retry connection errors that many times, 158s but no other types of errors. Pass zero to never retry. 158s 158s If ``False``, then retries are disabled and any exception is raised 158s immediately. Also, instead of raising a MaxRetryError on redirects, 158s the redirect response will be returned. 158s 158s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 158s 158s :param timeout: 158s If specified, overrides the default timeout for this one 158s request. It may be a float (in seconds) or an instance of 158s :class:`urllib3.util.Timeout`. 158s 158s :param chunked: 158s If True, urllib3 will send the body using chunked transfer 158s encoding. Otherwise, urllib3 will send the body using the standard 158s content-length form. Defaults to False. 158s 158s :param response_conn: 158s Set this to ``None`` if you will handle releasing the connection or 158s set the connection to have the response release it. 158s 158s :param preload_content: 158s If True, the response's body will be preloaded during construction. 158s 158s :param decode_content: 158s If True, will attempt to decode the body based on the 158s 'content-encoding' header. 158s 158s :param enforce_content_length: 158s Enforce content length checking. Body returned by server must match 158s value of Content-Length header, if present. Otherwise, raise error. 158s """ 158s self.num_requests += 1 158s 158s timeout_obj = self._get_timeout(timeout) 158s timeout_obj.start_connect() 158s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 158s 158s try: 158s # Trigger any extra validation we need to do. 158s try: 158s self._validate_conn(conn) 158s except (SocketTimeout, BaseSSLError) as e: 158s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 158s raise 158s 158s # _validate_conn() starts the connection to an HTTPS proxy 158s # so we need to wrap errors with 'ProxyError' here too. 158s except ( 158s OSError, 158s NewConnectionError, 158s TimeoutError, 158s BaseSSLError, 158s CertificateError, 158s SSLError, 158s ) as e: 158s new_e: Exception = e 158s if isinstance(e, (BaseSSLError, CertificateError)): 158s new_e = SSLError(e) 158s # If the connection didn't successfully connect to it's proxy 158s # then there 158s if isinstance( 158s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 158s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 158s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 158s raise new_e 158s 158s # conn.request() calls http.client.*.request, not the method in 158s # urllib3.request. It also calls makefile (recv) on the socket. 
158s try: 158s conn.request( 158s method, 158s url, 158s body=body, 158s headers=headers, 158s chunked=chunked, 158s preload_content=preload_content, 158s decode_content=decode_content, 158s enforce_content_length=enforce_content_length, 158s ) 158s 158s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 158s # legitimately able to close the connection after sending a valid response. 158s # With this behaviour, the received response is still readable. 158s except BrokenPipeError: 158s pass 158s except OSError as e: 158s # MacOS/Linux 158s # EPROTOTYPE and ECONNRESET are needed on macOS 158s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 158s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 158s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 158s raise 158s 158s # Reset the timeout for the recv() on the socket 158s read_timeout = timeout_obj.read_timeout 158s 158s if not conn.is_closed: 158s # In Python 3 socket.py will catch EAGAIN and return None when you 158s # try and read into the file pointer created by http.client, which 158s # instead raises a BadStatusLine exception. Instead of catching 158s # the exception and assuming all BadStatusLine exceptions are read 158s # timeouts, check for a zero timeout before making the request. 158s if read_timeout == 0: 158s raise ReadTimeoutError( 158s self, url, f"Read timed out. (read timeout={read_timeout})" 158s ) 158s conn.timeout = read_timeout 158s 158s # Receive the response from the server 158s try: 158s response = conn.getresponse() 158s except (BaseSSLError, OSError) as e: 158s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 158s raise 158s 158s # Set properties that are used by the pooling layer. 
158s response.retries = retries 158s response._connection = response_conn # type: ignore[attr-defined] 158s response._pool = self # type: ignore[attr-defined] 158s 158s log.debug( 158s '%s://%s:%s "%s %s %s" %s %s', 158s self.scheme, 158s self.host, 158s self.port, 158s method, 158s url, 158s > response.version_string, 158s response.status, 158s response.length_remaining, 158s ) 158s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 158s 158s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 158s ----------------------------- Captured stderr call ----------------------------- 158s 127.0.0.1 - - [18/Jan/2025 02:32:09] "GET / HTTP/1.1" 200 9358 158s 127.0.0.1 - - [18/Jan/2025 02:32:09] "GET http://127.0.0.1:35353/ HTTP/1.1" 200 - 158s ______________________________ test_cross_scheme _______________________________ 158s 158s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-0/test_cross_scheme2') 158s httpbin = 158s httpbin_secure = 158s verify_pool_mgr = 158s 158s def test_cross_scheme(tmpdir, httpbin, httpbin_secure, verify_pool_mgr): 158s """Ensure that requests between schemes are treated separately""" 158s # First fetch a url under http, and then again under https and then 158s # ensure that we haven't served anything out of cache, and we have two 158s # requests / response pairs in the cassette 158s with vcr.use_cassette(str(tmpdir.join("cross_scheme.yaml"))) as cass: 158s > verify_pool_mgr.request("GET", httpbin_secure.url) 158s 158s tests/integration/test_urllib3.py:125: 158s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 158s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 158s return self.request_encode_url( 158s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 158s return self.urlopen(method, url, **extra_kw) 158s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 158s response = conn.urlopen(method, u.request_uri, **kw) 158s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 158s response = self._make_request( 158s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 158s 158s self = 158s conn = 158s method = 'GET', url = '/', body = None, headers = {} 158s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 158s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 158s chunked = False, response_conn = None, preload_content = True 158s decode_content = True, enforce_content_length = True 158s 158s def _make_request( 158s self, 158s conn: BaseHTTPConnection, 158s method: str, 158s url: str, 158s body: _TYPE_BODY | None = None, 158s headers: typing.Mapping[str, str] | None = None, 158s retries: Retry | None = None, 158s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 158s chunked: bool = False, 158s response_conn: BaseHTTPConnection | None = None, 158s preload_content: bool = True, 158s decode_content: bool = True, 158s enforce_content_length: bool = True, 158s ) -> BaseHTTPResponse: 158s """ 158s Perform a request on a given urllib connection object taken from our 158s pool. 158s 158s :param conn: 158s a connection from one of our connection pools 158s 158s :param method: 158s HTTP request method (such as GET, POST, PUT, etc.) 158s 158s :param url: 158s The URL to perform the request on. 
158s 158s :param body: 158s Data to send in the request body, either :class:`str`, :class:`bytes`, 158s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 158s 158s :param headers: 158s Dictionary of custom headers to send, such as User-Agent, 158s If-None-Match, etc. If None, pool headers are used. If provided, 158s these headers completely replace any pool-specific headers. 158s 158s :param retries: 158s Configure the number of retries to allow before raising a 158s :class:`~urllib3.exceptions.MaxRetryError` exception. 158s 158s Pass ``None`` to retry until you receive a response. Pass a 158s :class:`~urllib3.util.retry.Retry` object for fine-grained control 158s over different types of retries. 158s Pass an integer number to retry connection errors that many times, 158s but no other types of errors. Pass zero to never retry. 158s 158s If ``False``, then retries are disabled and any exception is raised 158s immediately. Also, instead of raising a MaxRetryError on redirects, 158s the redirect response will be returned. 158s 158s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 158s 158s :param timeout: 158s If specified, overrides the default timeout for this one 158s request. It may be a float (in seconds) or an instance of 158s :class:`urllib3.util.Timeout`. 158s 158s :param chunked: 158s If True, urllib3 will send the body using chunked transfer 158s encoding. Otherwise, urllib3 will send the body using the standard 158s content-length form. Defaults to False. 158s 158s :param response_conn: 158s Set this to ``None`` if you will handle releasing the connection or 158s set the connection to have the response release it. 158s 158s :param preload_content: 158s If True, the response's body will be preloaded during construction. 158s 158s :param decode_content: 158s If True, will attempt to decode the body based on the 158s 'content-encoding' header. 158s 158s :param enforce_content_length: 158s Enforce content length checking. Body returned by server must match 158s value of Content-Length header, if present. Otherwise, raise error. 158s """ 158s self.num_requests += 1 158s 158s timeout_obj = self._get_timeout(timeout) 158s timeout_obj.start_connect() 158s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 158s 158s try: 158s # Trigger any extra validation we need to do. 158s try: 158s self._validate_conn(conn) 158s except (SocketTimeout, BaseSSLError) as e: 158s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 158s raise 158s 158s # _validate_conn() starts the connection to an HTTPS proxy 158s # so we need to wrap errors with 'ProxyError' here too. 158s except ( 158s OSError, 158s NewConnectionError, 158s TimeoutError, 158s BaseSSLError, 158s CertificateError, 158s SSLError, 158s ) as e: 158s new_e: Exception = e 158s if isinstance(e, (BaseSSLError, CertificateError)): 158s new_e = SSLError(e) 158s # If the connection didn't successfully connect to it's proxy 158s # then there 158s if isinstance( 158s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 158s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 158s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 158s raise new_e 158s 158s # conn.request() calls http.client.*.request, not the method in 158s # urllib3.request. It also calls makefile (recv) on the socket. 
158s try: 158s conn.request( 158s method, 158s url, 158s body=body, 158s headers=headers, 158s chunked=chunked, 158s preload_content=preload_content, 158s decode_content=decode_content, 158s enforce_content_length=enforce_content_length, 158s ) 158s 158s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 158s # legitimately able to close the connection after sending a valid response. 158s # With this behaviour, the received response is still readable. 158s except BrokenPipeError: 158s pass 158s except OSError as e: 158s # MacOS/Linux 158s # EPROTOTYPE and ECONNRESET are needed on macOS 158s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 158s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 158s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 158s raise 158s 158s # Reset the timeout for the recv() on the socket 158s read_timeout = timeout_obj.read_timeout 158s 158s if not conn.is_closed: 158s # In Python 3 socket.py will catch EAGAIN and return None when you 158s # try and read into the file pointer created by http.client, which 158s # instead raises a BadStatusLine exception. Instead of catching 158s # the exception and assuming all BadStatusLine exceptions are read 158s # timeouts, check for a zero timeout before making the request. 158s if read_timeout == 0: 158s raise ReadTimeoutError( 158s self, url, f"Read timed out. (read timeout={read_timeout})" 158s ) 158s conn.timeout = read_timeout 158s 158s # Receive the response from the server 158s try: 158s response = conn.getresponse() 158s except (BaseSSLError, OSError) as e: 158s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 158s raise 158s 158s # Set properties that are used by the pooling layer. 
158s response.retries = retries 158s response._connection = response_conn # type: ignore[attr-defined] 158s response._pool = self # type: ignore[attr-defined] 158s 158s log.debug( 158s '%s://%s:%s "%s %s %s" %s %s', 158s self.scheme, 158s self.host, 158s self.port, 158s method, 158s url, 158s > response.version_string, 158s response.status, 158s response.length_remaining, 158s ) 158s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 158s 158s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 158s ----------------------------- Captured stderr call ----------------------------- 158s 127.0.0.1 - - [18/Jan/2025 02:32:09] "GET / HTTP/1.1" 200 9358 158s ___________________ test_https_with_cert_validation_disabled ___________________ 158s 158s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-0/test_https_with_cert_validatio0') 158s httpbin_secure = 158s pool_mgr = 158s 158s def test_https_with_cert_validation_disabled(tmpdir, httpbin_secure, pool_mgr): 158s with vcr.use_cassette(str(tmpdir.join("cert_validation_disabled.yaml"))): 158s > pool_mgr.request("GET", httpbin_secure.url) 158s 158s tests/integration/test_urllib3.py:149: 158s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 158s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 158s return self.request_encode_url( 158s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 158s return self.urlopen(method, url, **extra_kw) 158s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 158s response = conn.urlopen(method, u.request_uri, **kw) 158s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 158s response = self._make_request( 158s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 158s 158s self = 158s conn = 158s method = 'GET', url = '/', body = None, headers = {} 158s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 158s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 158s chunked = False, response_conn = None, preload_content = True 158s decode_content = True, enforce_content_length = True 158s 158s def _make_request( 158s self, 158s conn: BaseHTTPConnection, 158s method: str, 158s url: str, 158s body: _TYPE_BODY | None = None, 158s headers: typing.Mapping[str, str] | None = None, 158s retries: Retry | None = None, 158s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 158s chunked: bool = False, 158s response_conn: BaseHTTPConnection | None = None, 158s preload_content: bool = True, 158s decode_content: bool = True, 158s enforce_content_length: bool = True, 158s ) -> BaseHTTPResponse: 158s """ 158s Perform a request on a given urllib connection object taken from our 158s pool. 158s 158s :param conn: 158s a connection from one of our connection pools 158s 158s :param method: 158s HTTP request method (such as GET, POST, PUT, etc.) 158s 158s :param url: 158s The URL to perform the request on. 158s 158s :param body: 158s Data to send in the request body, either :class:`str`, :class:`bytes`, 158s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 158s 158s :param headers: 158s Dictionary of custom headers to send, such as User-Agent, 158s If-None-Match, etc. If None, pool headers are used. If provided, 158s these headers completely replace any pool-specific headers. 
158s 158s :param retries: 158s Configure the number of retries to allow before raising a 158s :class:`~urllib3.exceptions.MaxRetryError` exception. 158s 158s Pass ``None`` to retry until you receive a response. Pass a 158s :class:`~urllib3.util.retry.Retry` object for fine-grained control 158s over different types of retries. 158s Pass an integer number to retry connection errors that many times, 158s but no other types of errors. Pass zero to never retry. 158s 158s If ``False``, then retries are disabled and any exception is raised 158s immediately. Also, instead of raising a MaxRetryError on redirects, 158s the redirect response will be returned. 158s 158s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 158s 158s :param timeout: 158s If specified, overrides the default timeout for this one 158s request. It may be a float (in seconds) or an instance of 158s :class:`urllib3.util.Timeout`. 158s 158s :param chunked: 158s If True, urllib3 will send the body using chunked transfer 158s encoding. Otherwise, urllib3 will send the body using the standard 158s content-length form. Defaults to False. 158s 158s :param response_conn: 158s Set this to ``None`` if you will handle releasing the connection or 158s set the connection to have the response release it. 158s 158s :param preload_content: 158s If True, the response's body will be preloaded during construction. 158s 158s :param decode_content: 158s If True, will attempt to decode the body based on the 158s 'content-encoding' header. 158s 158s :param enforce_content_length: 158s Enforce content length checking. Body returned by server must match 158s value of Content-Length header, if present. Otherwise, raise error. 158s """ 158s self.num_requests += 1 158s 158s timeout_obj = self._get_timeout(timeout) 158s timeout_obj.start_connect() 158s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 158s 158s try: 158s # Trigger any extra validation we need to do. 158s try: 158s self._validate_conn(conn) 158s except (SocketTimeout, BaseSSLError) as e: 158s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 158s raise 158s 158s # _validate_conn() starts the connection to an HTTPS proxy 158s # so we need to wrap errors with 'ProxyError' here too. 158s except ( 158s OSError, 158s NewConnectionError, 158s TimeoutError, 158s BaseSSLError, 158s CertificateError, 158s SSLError, 158s ) as e: 158s new_e: Exception = e 158s if isinstance(e, (BaseSSLError, CertificateError)): 158s new_e = SSLError(e) 158s # If the connection didn't successfully connect to it's proxy 158s # then there 158s if isinstance( 158s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 158s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 158s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 158s raise new_e 158s 158s # conn.request() calls http.client.*.request, not the method in 158s # urllib3.request. It also calls makefile (recv) on the socket. 158s try: 158s conn.request( 158s method, 158s url, 158s body=body, 158s headers=headers, 158s chunked=chunked, 158s preload_content=preload_content, 158s decode_content=decode_content, 158s enforce_content_length=enforce_content_length, 158s ) 158s 158s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 158s # legitimately able to close the connection after sending a valid response. 158s # With this behaviour, the received response is still readable. 
158s except BrokenPipeError: 158s pass 158s except OSError as e: 158s # MacOS/Linux 158s # EPROTOTYPE and ECONNRESET are needed on macOS 158s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 158s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 158s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 158s raise 158s 158s # Reset the timeout for the recv() on the socket 158s read_timeout = timeout_obj.read_timeout 158s 158s if not conn.is_closed: 158s # In Python 3 socket.py will catch EAGAIN and return None when you 158s # try and read into the file pointer created by http.client, which 158s # instead raises a BadStatusLine exception. Instead of catching 158s # the exception and assuming all BadStatusLine exceptions are read 158s # timeouts, check for a zero timeout before making the request. 158s if read_timeout == 0: 158s raise ReadTimeoutError( 158s self, url, f"Read timed out. (read timeout={read_timeout})" 158s ) 158s conn.timeout = read_timeout 158s 158s # Receive the response from the server 158s try: 158s response = conn.getresponse() 158s except (BaseSSLError, OSError) as e: 158s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 158s raise 158s 158s # Set properties that are used by the pooling layer. 158s response.retries = retries 158s response._connection = response_conn # type: ignore[attr-defined] 158s response._pool = self # type: ignore[attr-defined] 158s 158s log.debug( 158s '%s://%s:%s "%s %s %s" %s %s', 158s self.scheme, 158s self.host, 158s self.port, 158s method, 158s url, 158s > response.version_string, 158s response.status, 158s response.length_remaining, 158s ) 158s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 158s 158s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 158s ----------------------------- Captured stderr call ----------------------------- 158s 127.0.0.1 - - [18/Jan/2025 02:32:09] "GET / HTTP/1.1" 200 9358 158s _____________________________ test_domain_redirect _____________________________ 158s 158s def test_domain_redirect(): 158s """Ensure that redirects across domains are considered unique""" 158s # In this example, seomoz.org redirects to moz.com, and if those 158s # requests are considered identical, then we'll be stuck in a redirect 158s # loop. 
158s url = "http://seomoz.org/" 158s with vcr.use_cassette("tests/fixtures/wild/domain_redirect.yaml") as cass: 158s > requests.get(url, headers={"User-Agent": "vcrpy-test"}) 158s 158s tests/integration/test_wild.py:20: 158s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 158s /usr/lib/python3/dist-packages/requests/api.py:73: in get 158s return request("get", url, params=params, **kwargs) 158s /usr/lib/python3/dist-packages/requests/api.py:59: in request 158s return session.request(method=method, url=url, **kwargs) 158s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 158s resp = self.send(prep, **send_kwargs) 158s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 158s r = adapter.send(request, **kwargs) 158s /usr/lib/python3/dist-packages/requests/adapters.py:667: in send 158s resp = conn.urlopen( 158s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 158s response = self._make_request( 158s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 158s 158s self = 158s conn = 158s method = 'GET', url = '/', body = None 158s headers = {'User-Agent': 'vcrpy-test', 'Accept-Encoding': 'gzip, deflate, br', 'Accept': '*/*', 'Connection': 'keep-alive'} 158s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 158s timeout = Timeout(connect=None, read=None, total=None), chunked = False 158s response_conn = 158s preload_content = False, decode_content = False, enforce_content_length = True 158s 158s def _make_request( 158s self, 158s conn: BaseHTTPConnection, 158s method: str, 158s url: str, 158s body: _TYPE_BODY | None = None, 158s headers: typing.Mapping[str, str] | None = None, 158s retries: Retry | None = None, 158s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 158s chunked: bool = False, 158s response_conn: BaseHTTPConnection | None = None, 158s preload_content: bool = True, 158s decode_content: bool = True, 158s enforce_content_length: bool = True, 158s ) -> BaseHTTPResponse: 158s """ 158s Perform a request on a given urllib connection object taken from our 158s pool. 158s 158s :param conn: 158s a connection from one of our connection pools 158s 158s :param method: 158s HTTP request method (such as GET, POST, PUT, etc.) 158s 158s :param url: 158s The URL to perform the request on. 158s 158s :param body: 158s Data to send in the request body, either :class:`str`, :class:`bytes`, 158s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 158s 158s :param headers: 158s Dictionary of custom headers to send, such as User-Agent, 158s If-None-Match, etc. If None, pool headers are used. If provided, 158s these headers completely replace any pool-specific headers. 158s 158s :param retries: 158s Configure the number of retries to allow before raising a 158s :class:`~urllib3.exceptions.MaxRetryError` exception. 158s 158s Pass ``None`` to retry until you receive a response. Pass a 158s :class:`~urllib3.util.retry.Retry` object for fine-grained control 158s over different types of retries. 158s Pass an integer number to retry connection errors that many times, 158s but no other types of errors. Pass zero to never retry. 158s 158s If ``False``, then retries are disabled and any exception is raised 158s immediately. Also, instead of raising a MaxRetryError on redirects, 158s the redirect response will be returned. 158s 158s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 
158s 158s :param timeout: 158s If specified, overrides the default timeout for this one 158s request. It may be a float (in seconds) or an instance of 158s :class:`urllib3.util.Timeout`. 158s 158s :param chunked: 158s If True, urllib3 will send the body using chunked transfer 158s encoding. Otherwise, urllib3 will send the body using the standard 158s content-length form. Defaults to False. 158s 158s :param response_conn: 158s Set this to ``None`` if you will handle releasing the connection or 158s set the connection to have the response release it. 158s 158s :param preload_content: 158s If True, the response's body will be preloaded during construction. 158s 158s :param decode_content: 158s If True, will attempt to decode the body based on the 158s 'content-encoding' header. 158s 158s :param enforce_content_length: 158s Enforce content length checking. Body returned by server must match 158s value of Content-Length header, if present. Otherwise, raise error. 158s """ 158s self.num_requests += 1 158s 158s timeout_obj = self._get_timeout(timeout) 158s timeout_obj.start_connect() 158s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 158s 158s try: 158s # Trigger any extra validation we need to do. 158s try: 158s self._validate_conn(conn) 158s except (SocketTimeout, BaseSSLError) as e: 158s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 158s raise 158s 158s # _validate_conn() starts the connection to an HTTPS proxy 158s # so we need to wrap errors with 'ProxyError' here too. 158s except ( 158s OSError, 158s NewConnectionError, 158s TimeoutError, 158s BaseSSLError, 158s CertificateError, 158s SSLError, 158s ) as e: 158s new_e: Exception = e 158s if isinstance(e, (BaseSSLError, CertificateError)): 158s new_e = SSLError(e) 158s # If the connection didn't successfully connect to it's proxy 158s # then there 158s if isinstance( 158s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 158s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 158s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 158s raise new_e 158s 158s # conn.request() calls http.client.*.request, not the method in 158s # urllib3.request. It also calls makefile (recv) on the socket. 158s try: 158s conn.request( 158s method, 158s url, 158s body=body, 158s headers=headers, 158s chunked=chunked, 158s preload_content=preload_content, 158s decode_content=decode_content, 158s enforce_content_length=enforce_content_length, 158s ) 158s 158s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 158s # legitimately able to close the connection after sending a valid response. 158s # With this behaviour, the received response is still readable. 158s except BrokenPipeError: 158s pass 158s except OSError as e: 158s # MacOS/Linux 158s # EPROTOTYPE and ECONNRESET are needed on macOS 158s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 158s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 158s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 158s raise 158s 158s # Reset the timeout for the recv() on the socket 158s read_timeout = timeout_obj.read_timeout 158s 158s if not conn.is_closed: 158s # In Python 3 socket.py will catch EAGAIN and return None when you 158s # try and read into the file pointer created by http.client, which 158s # instead raises a BadStatusLine exception. 
Instead of catching 158s # the exception and assuming all BadStatusLine exceptions are read 158s # timeouts, check for a zero timeout before making the request. 158s if read_timeout == 0: 158s raise ReadTimeoutError( 158s self, url, f"Read timed out. (read timeout={read_timeout})" 158s ) 158s conn.timeout = read_timeout 158s 158s # Receive the response from the server 158s try: 158s response = conn.getresponse() 158s except (BaseSSLError, OSError) as e: 158s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 158s raise 158s 158s # Set properties that are used by the pooling layer. 158s response.retries = retries 158s response._connection = response_conn # type: ignore[attr-defined] 158s response._pool = self # type: ignore[attr-defined] 158s 158s log.debug( 158s '%s://%s:%s "%s %s %s" %s %s', 158s self.scheme, 158s self.host, 158s self.port, 158s method, 158s url, 158s > response.version_string, 158s response.status, 158s response.length_remaining, 158s ) 158s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 158s 158s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 158s _________________________________ test_cookies _________________________________ 158s 158s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-0/test_cookies0') 158s httpbin = 158s 158s def test_cookies(tmpdir, httpbin): 158s testfile = str(tmpdir.join("cookies.yml")) 158s with vcr.use_cassette(testfile): 158s with requests.Session() as s: 158s > s.get(httpbin.url + "/cookies/set?k1=v1&k2=v2") 158s 158s tests/integration/test_wild.py:67: 158s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 158s /usr/lib/python3/dist-packages/requests/sessions.py:602: in get 158s return self.request("GET", url, **kwargs) 158s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 158s resp = self.send(prep, **send_kwargs) 158s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 158s r = adapter.send(request, **kwargs) 158s /usr/lib/python3/dist-packages/requests/adapters.py:667: in send 158s resp = conn.urlopen( 158s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 158s response = self._make_request( 158s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 158s 158s self = 158s conn = 158s method = 'GET', url = '/cookies/set?k1=v1&k2=v2', body = None 158s headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate, br', 'Accept': '*/*', 'Connection': 'keep-alive'} 158s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 158s timeout = Timeout(connect=None, read=None, total=None), chunked = False 158s response_conn = 158s preload_content = False, decode_content = False, enforce_content_length = True 158s 158s def _make_request( 158s self, 158s conn: BaseHTTPConnection, 158s method: str, 158s url: str, 158s body: _TYPE_BODY | None = None, 158s headers: typing.Mapping[str, str] | None = None, 158s retries: Retry | None = None, 158s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 158s chunked: bool = False, 158s response_conn: BaseHTTPConnection | None = None, 158s preload_content: bool = True, 158s decode_content: bool = True, 158s enforce_content_length: bool = True, 158s ) -> BaseHTTPResponse: 158s """ 158s Perform a request on a given urllib connection object taken from our 158s pool. 
158s 158s :param conn: 158s a connection from one of our connection pools 158s 158s :param method: 158s HTTP request method (such as GET, POST, PUT, etc.) 158s 158s :param url: 158s The URL to perform the request on. 158s 158s :param body: 158s Data to send in the request body, either :class:`str`, :class:`bytes`, 158s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 158s 158s :param headers: 158s Dictionary of custom headers to send, such as User-Agent, 158s If-None-Match, etc. If None, pool headers are used. If provided, 158s these headers completely replace any pool-specific headers. 158s 158s :param retries: 158s Configure the number of retries to allow before raising a 158s :class:`~urllib3.exceptions.MaxRetryError` exception. 158s 158s Pass ``None`` to retry until you receive a response. Pass a 158s :class:`~urllib3.util.retry.Retry` object for fine-grained control 158s over different types of retries. 158s Pass an integer number to retry connection errors that many times, 158s but no other types of errors. Pass zero to never retry. 158s 158s If ``False``, then retries are disabled and any exception is raised 158s immediately. Also, instead of raising a MaxRetryError on redirects, 158s the redirect response will be returned. 158s 158s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 158s 158s :param timeout: 158s If specified, overrides the default timeout for this one 158s request. It may be a float (in seconds) or an instance of 158s :class:`urllib3.util.Timeout`. 158s 158s :param chunked: 158s If True, urllib3 will send the body using chunked transfer 158s encoding. Otherwise, urllib3 will send the body using the standard 158s content-length form. Defaults to False. 158s 158s :param response_conn: 158s Set this to ``None`` if you will handle releasing the connection or 158s set the connection to have the response release it. 158s 158s :param preload_content: 158s If True, the response's body will be preloaded during construction. 158s 158s :param decode_content: 158s If True, will attempt to decode the body based on the 158s 'content-encoding' header. 158s 158s :param enforce_content_length: 158s Enforce content length checking. Body returned by server must match 158s value of Content-Length header, if present. Otherwise, raise error. 158s """ 158s self.num_requests += 1 158s 158s timeout_obj = self._get_timeout(timeout) 158s timeout_obj.start_connect() 158s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 158s 158s try: 158s # Trigger any extra validation we need to do. 158s try: 158s self._validate_conn(conn) 158s except (SocketTimeout, BaseSSLError) as e: 158s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 158s raise 158s 158s # _validate_conn() starts the connection to an HTTPS proxy 158s # so we need to wrap errors with 'ProxyError' here too. 158s except ( 158s OSError, 158s NewConnectionError, 158s TimeoutError, 158s BaseSSLError, 158s CertificateError, 158s SSLError, 158s ) as e: 158s new_e: Exception = e 158s if isinstance(e, (BaseSSLError, CertificateError)): 158s new_e = SSLError(e) 158s # If the connection didn't successfully connect to it's proxy 158s # then there 158s if isinstance( 158s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 158s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 158s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 158s raise new_e 158s 158s # conn.request() calls http.client.*.request, not the method in 158s # urllib3.request. 
It also calls makefile (recv) on the socket. 158s try: 158s conn.request( 158s method, 158s url, 158s body=body, 158s headers=headers, 158s chunked=chunked, 158s preload_content=preload_content, 158s decode_content=decode_content, 158s enforce_content_length=enforce_content_length, 158s ) 158s 158s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 158s # legitimately able to close the connection after sending a valid response. 158s # With this behaviour, the received response is still readable. 158s except BrokenPipeError: 158s pass 158s except OSError as e: 158s # MacOS/Linux 158s # EPROTOTYPE and ECONNRESET are needed on macOS 158s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 158s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 158s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 158s raise 158s 158s # Reset the timeout for the recv() on the socket 158s read_timeout = timeout_obj.read_timeout 158s 158s if not conn.is_closed: 158s # In Python 3 socket.py will catch EAGAIN and return None when you 158s # try and read into the file pointer created by http.client, which 158s # instead raises a BadStatusLine exception. Instead of catching 158s # the exception and assuming all BadStatusLine exceptions are read 158s # timeouts, check for a zero timeout before making the request. 158s if read_timeout == 0: 158s raise ReadTimeoutError( 158s self, url, f"Read timed out. (read timeout={read_timeout})" 158s ) 158s conn.timeout = read_timeout 158s 158s # Receive the response from the server 158s try: 158s response = conn.getresponse() 158s except (BaseSSLError, OSError) as e: 158s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 158s raise 158s 158s # Set properties that are used by the pooling layer. 158s response.retries = retries 158s response._connection = response_conn # type: ignore[attr-defined] 158s response._pool = self # type: ignore[attr-defined] 158s 158s log.debug( 158s '%s://%s:%s "%s %s %s" %s %s', 158s self.scheme, 158s self.host, 158s self.port, 158s method, 158s url, 158s > response.version_string, 158s response.status, 158s response.length_remaining, 158s ) 158s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 158s 158s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 158s ----------------------------- Captured stderr call ----------------------------- 158s 127.0.0.1 - - [18/Jan/2025 02:32:09] "GET /cookies/set?k1=v1&k2=v2 HTTP/1.1" 302 203 158s =============================== warnings summary =============================== 158s tests/integration/test_config.py:10 158s /tmp/autopkgtest.ZTKQrF/autopkgtest_tmp/build/tests/integration/test_config.py:10: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 158s @pytest.mark.online 158s 158s tests/integration/test_config.py:24 158s /tmp/autopkgtest.ZTKQrF/autopkgtest_tmp/build/tests/integration/test_config.py:24: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? 
You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 158s @pytest.mark.online 158s 158s tests/integration/test_config.py:34 158s /tmp/autopkgtest.ZTKQrF/autopkgtest_tmp/build/tests/integration/test_config.py:34: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 158s @pytest.mark.online 158s 158s tests/integration/test_config.py:47 158s /tmp/autopkgtest.ZTKQrF/autopkgtest_tmp/build/tests/integration/test_config.py:47: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 158s @pytest.mark.online 158s 158s tests/integration/test_config.py:69 158s /tmp/autopkgtest.ZTKQrF/autopkgtest_tmp/build/tests/integration/test_config.py:69: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 158s @pytest.mark.online 158s 158s tests/integration/test_disksaver.py:14 158s /tmp/autopkgtest.ZTKQrF/autopkgtest_tmp/build/tests/integration/test_disksaver.py:14: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 158s @pytest.mark.online 158s 158s tests/integration/test_disksaver.py:35 158s /tmp/autopkgtest.ZTKQrF/autopkgtest_tmp/build/tests/integration/test_disksaver.py:35: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 158s @pytest.mark.online 158s 158s tests/integration/test_httplib2.py:60 158s /tmp/autopkgtest.ZTKQrF/autopkgtest_tmp/build/tests/integration/test_httplib2.py:60: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 158s @pytest.mark.online 158s 158s tests/integration/test_register_matcher.py:16 158s /tmp/autopkgtest.ZTKQrF/autopkgtest_tmp/build/tests/integration/test_register_matcher.py:16: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 158s @pytest.mark.online 158s 158s tests/integration/test_register_matcher.py:32 158s /tmp/autopkgtest.ZTKQrF/autopkgtest_tmp/build/tests/integration/test_register_matcher.py:32: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 158s @pytest.mark.online 158s 158s tests/integration/test_urllib2.py:60 158s /tmp/autopkgtest.ZTKQrF/autopkgtest_tmp/build/tests/integration/test_urllib2.py:60: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? 
You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 158s @mark.online 158s 158s tests/integration/test_urllib3.py:102 158s /tmp/autopkgtest.ZTKQrF/autopkgtest_tmp/build/tests/integration/test_urllib3.py:102: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 158s @pytest.mark.online 158s 158s tests/integration/test_wild.py:55 158s /tmp/autopkgtest.ZTKQrF/autopkgtest_tmp/build/tests/integration/test_wild.py:55: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 158s @pytest.mark.online 158s 158s tests/integration/test_wild.py:74 158s /tmp/autopkgtest.ZTKQrF/autopkgtest_tmp/build/tests/integration/test_wild.py:74: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 158s @pytest.mark.online 158s 158s tests/unit/test_stubs.py:20 158s /tmp/autopkgtest.ZTKQrF/autopkgtest_tmp/build/tests/unit/test_stubs.py:20: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 158s @mark.online 158s 158s tests/unit/test_unittest.py:131 158s /tmp/autopkgtest.ZTKQrF/autopkgtest_tmp/build/tests/unit/test_unittest.py:131: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 158s @pytest.mark.online 158s 158s tests/unit/test_unittest.py:166 158s /tmp/autopkgtest.ZTKQrF/autopkgtest_tmp/build/tests/unit/test_unittest.py:166: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 158s @pytest.mark.online 158s 158s tests/integration/test_wild.py::test_xmlrpclib 158s /usr/lib/python3.13/multiprocessing/popen_fork.py:67: DeprecationWarning: This process (pid=2904) is multi-threaded, use of fork() may lead to deadlocks in the child. 158s self.pid = os.fork() 158s 158s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 158s =========================== short test summary info ============================ 158s FAILED tests/integration/test_urllib3.py::test_status_code[http] - AttributeE... 158s FAILED tests/integration/test_urllib3.py::test_headers[http] - AttributeError... 158s FAILED tests/integration/test_urllib3.py::test_body[http] - AttributeError: '... 158s FAILED tests/integration/test_urllib3.py::test_auth[http] - AttributeError: '... 158s FAILED tests/integration/test_urllib3.py::test_auth_failed[http] - AttributeE... 158s FAILED tests/integration/test_urllib3.py::test_post[http] - AttributeError: '... 158s FAILED tests/integration/test_urllib3.py::test_gzip[http] - AttributeError: '... 158s FAILED tests/integration/test_urllib3.py::test_status_code[https] - Attribute... 158s FAILED tests/integration/test_urllib3.py::test_headers[https] - AttributeErro... 158s FAILED tests/integration/test_urllib3.py::test_body[https] - AttributeError: ... 
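The PytestUnknownMarkWarning entries above are noise rather than failures: the suite uses a custom "online" mark (deselected in this run via -m "not online") that is simply not registered in the build environment. Registering it the way the warning suggests would silence them; a minimal pytest.ini stanza, with the same keys usable under [tool.pytest.ini_options] in pyproject.toml, might look like the following (the description text is only illustrative):

    [pytest]
    markers =
        online: tests that need real network access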
158s FAILED tests/integration/test_urllib3.py::test_auth[https] - AttributeError: ... 158s FAILED tests/integration/test_urllib3.py::test_auth_failed[https] - Attribute... 158s FAILED tests/integration/test_urllib3.py::test_post[https] - AttributeError: ... 158s FAILED tests/integration/test_urllib3.py::test_gzip[https] - AttributeError: ... 158s FAILED tests/integration/test_proxy.py::test_use_proxy - AttributeError: 'VCR... 158s FAILED tests/integration/test_urllib3.py::test_cross_scheme - AttributeError:... 158s FAILED tests/integration/test_urllib3.py::test_https_with_cert_validation_disabled 158s FAILED tests/integration/test_wild.py::test_domain_redirect - AttributeError:... 158s FAILED tests/integration/test_wild.py::test_cookies - AttributeError: 'VCRHTT... 158s ==== 19 failed, 265 passed, 3 skipped, 19 deselected, 18 warnings in 2.80s ===== 158s E: pybuild pybuild:389: test: plugin distutils failed with: exit code=1: cd /tmp/autopkgtest.ZTKQrF/autopkgtest_tmp/build; python3.13 -m pytest --ignore tests/integration/test_aiohttp.py --ignore tests/integration/test_tornado.py --ignore tests/integration/test_requests.py -m "not online" -k "not test_basic_json_use and not test_load_cassette_with_custom_persister" 158s I: pybuild base:311: cd /tmp/autopkgtest.ZTKQrF/autopkgtest_tmp/build; python3.12 -m pytest --ignore tests/integration/test_aiohttp.py --ignore tests/integration/test_tornado.py --ignore tests/integration/test_requests.py -m "not online" -k "not test_basic_json_use and not test_load_cassette_with_custom_persister" 159s ============================= test session starts ============================== 159s platform linux -- Python 3.12.8, pytest-8.3.4, pluggy-1.5.0 159s rootdir: /tmp/autopkgtest.ZTKQrF/autopkgtest_tmp/build 159s plugins: tornado-0.8.1, httpbin-2.1.0, typeguard-4.4.1 159s collected 305 items / 19 deselected / 1 skipped / 286 selected 159s 159s tests/integration/test_basic.py .... [ 1%] 159s tests/integration/test_boto3.py ss [ 2%] 159s tests/integration/test_config.py . [ 2%] 159s tests/integration/test_filter.py .......... [ 5%] 159s tests/integration/test_httplib2.py ........ [ 8%] 159s tests/integration/test_urllib2.py ........ [ 11%] 159s tests/integration/test_urllib3.py FFFFFFF [ 13%] 159s tests/integration/test_httplib2.py ........ [ 16%] 159s tests/integration/test_urllib2.py ........ [ 19%] 160s tests/integration/test_urllib3.py FFFFFFF [ 22%] 160s tests/integration/test_httplib2.py . [ 22%] 160s tests/integration/test_ignore.py .... [ 23%] 160s tests/integration/test_matchers.py .............. [ 28%] 160s tests/integration/test_multiple.py . [ 29%] 160s tests/integration/test_proxy.py F [ 29%] 160s tests/integration/test_record_mode.py ........ [ 32%] 160s tests/integration/test_register_persister.py .. [ 32%] 160s tests/integration/test_register_serializer.py . [ 33%] 160s tests/integration/test_request.py .. [ 33%] 160s tests/integration/test_stubs.py .... [ 35%] 160s tests/integration/test_urllib2.py . [ 35%] 160s tests/integration/test_urllib3.py FF. [ 36%] 160s tests/integration/test_wild.py F.F. [ 38%] 160s tests/unit/test_cassettes.py ............................... [ 48%] 160s tests/unit/test_errors.py .... [ 50%] 160s tests/unit/test_filters.py ........................ [ 58%] 160s tests/unit/test_json_serializer.py . [ 59%] 160s tests/unit/test_matchers.py ............................ [ 68%] 160s tests/unit/test_migration.py ... [ 69%] 160s tests/unit/test_persist.py .... [ 71%] 160s tests/unit/test_request.py ................. 
[ 77%] 160s tests/unit/test_response.py .... [ 78%] 160s tests/unit/test_serialize.py ............... [ 83%] 160s tests/unit/test_stubs.py ... [ 84%] 160s tests/unit/test_unittest.py ....... [ 87%] 160s tests/unit/test_util.py ........... [ 91%] 160s tests/unit/test_vcr.py ........................ [ 99%] 161s tests/unit/test_vcr_import.py . [100%] 161s 161s =================================== FAILURES =================================== 161s ____________________________ test_status_code[http] ____________________________ 161s 161s httpbin_both = 161s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-1/test_status_code_http_0') 161s verify_pool_mgr = 161s 161s def test_status_code(httpbin_both, tmpdir, verify_pool_mgr): 161s """Ensure that we can read the status code""" 161s url = httpbin_both.url 161s with vcr.use_cassette(str(tmpdir.join("atts.yaml"))): 161s > status_code = verify_pool_mgr.request("GET", url).status 161s 161s tests/integration/test_urllib3.py:34: 161s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 161s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 161s return self.request_encode_url( 161s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 161s return self.urlopen(method, url, **extra_kw) 161s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 161s response = conn.urlopen(method, u.request_uri, **kw) 161s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 161s response = self._make_request( 161s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 161s 161s self = 161s conn = 161s method = 'GET', url = '/', body = None, headers = {} 161s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 161s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 161s chunked = False, response_conn = None, preload_content = True 161s decode_content = True, enforce_content_length = True 161s 161s def _make_request( 161s self, 161s conn: BaseHTTPConnection, 161s method: str, 161s url: str, 161s body: _TYPE_BODY | None = None, 161s headers: typing.Mapping[str, str] | None = None, 161s retries: Retry | None = None, 161s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 161s chunked: bool = False, 161s response_conn: BaseHTTPConnection | None = None, 161s preload_content: bool = True, 161s decode_content: bool = True, 161s enforce_content_length: bool = True, 161s ) -> BaseHTTPResponse: 161s """ 161s Perform a request on a given urllib connection object taken from our 161s pool. 161s 161s :param conn: 161s a connection from one of our connection pools 161s 161s :param method: 161s HTTP request method (such as GET, POST, PUT, etc.) 161s 161s :param url: 161s The URL to perform the request on. 161s 161s :param body: 161s Data to send in the request body, either :class:`str`, :class:`bytes`, 161s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 161s 161s :param headers: 161s Dictionary of custom headers to send, such as User-Agent, 161s If-None-Match, etc. If None, pool headers are used. If provided, 161s these headers completely replace any pool-specific headers. 161s 161s :param retries: 161s Configure the number of retries to allow before raising a 161s :class:`~urllib3.exceptions.MaxRetryError` exception. 161s 161s Pass ``None`` to retry until you receive a response. 
Pass a 161s :class:`~urllib3.util.retry.Retry` object for fine-grained control 161s over different types of retries. 161s Pass an integer number to retry connection errors that many times, 161s but no other types of errors. Pass zero to never retry. 161s 161s If ``False``, then retries are disabled and any exception is raised 161s immediately. Also, instead of raising a MaxRetryError on redirects, 161s the redirect response will be returned. 161s 161s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 161s 161s :param timeout: 161s If specified, overrides the default timeout for this one 161s request. It may be a float (in seconds) or an instance of 161s :class:`urllib3.util.Timeout`. 161s 161s :param chunked: 161s If True, urllib3 will send the body using chunked transfer 161s encoding. Otherwise, urllib3 will send the body using the standard 161s content-length form. Defaults to False. 161s 161s :param response_conn: 161s Set this to ``None`` if you will handle releasing the connection or 161s set the connection to have the response release it. 161s 161s :param preload_content: 161s If True, the response's body will be preloaded during construction. 161s 161s :param decode_content: 161s If True, will attempt to decode the body based on the 161s 'content-encoding' header. 161s 161s :param enforce_content_length: 161s Enforce content length checking. Body returned by server must match 161s value of Content-Length header, if present. Otherwise, raise error. 161s """ 161s self.num_requests += 1 161s 161s timeout_obj = self._get_timeout(timeout) 161s timeout_obj.start_connect() 161s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 161s 161s try: 161s # Trigger any extra validation we need to do. 161s try: 161s self._validate_conn(conn) 161s except (SocketTimeout, BaseSSLError) as e: 161s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 161s raise 161s 161s # _validate_conn() starts the connection to an HTTPS proxy 161s # so we need to wrap errors with 'ProxyError' here too. 161s except ( 161s OSError, 161s NewConnectionError, 161s TimeoutError, 161s BaseSSLError, 161s CertificateError, 161s SSLError, 161s ) as e: 161s new_e: Exception = e 161s if isinstance(e, (BaseSSLError, CertificateError)): 161s new_e = SSLError(e) 161s # If the connection didn't successfully connect to it's proxy 161s # then there 161s if isinstance( 161s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 161s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 161s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 161s raise new_e 161s 161s # conn.request() calls http.client.*.request, not the method in 161s # urllib3.request. It also calls makefile (recv) on the socket. 161s try: 161s conn.request( 161s method, 161s url, 161s body=body, 161s headers=headers, 161s chunked=chunked, 161s preload_content=preload_content, 161s decode_content=decode_content, 161s enforce_content_length=enforce_content_length, 161s ) 161s 161s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 161s # legitimately able to close the connection after sending a valid response. 161s # With this behaviour, the received response is still readable. 
161s except BrokenPipeError: 161s pass 161s except OSError as e: 161s # MacOS/Linux 161s # EPROTOTYPE and ECONNRESET are needed on macOS 161s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 161s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 161s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 161s raise 161s 161s # Reset the timeout for the recv() on the socket 161s read_timeout = timeout_obj.read_timeout 161s 161s if not conn.is_closed: 161s # In Python 3 socket.py will catch EAGAIN and return None when you 161s # try and read into the file pointer created by http.client, which 161s # instead raises a BadStatusLine exception. Instead of catching 161s # the exception and assuming all BadStatusLine exceptions are read 161s # timeouts, check for a zero timeout before making the request. 161s if read_timeout == 0: 161s raise ReadTimeoutError( 161s self, url, f"Read timed out. (read timeout={read_timeout})" 161s ) 161s conn.timeout = read_timeout 161s 161s # Receive the response from the server 161s try: 161s response = conn.getresponse() 161s except (BaseSSLError, OSError) as e: 161s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 161s raise 161s 161s # Set properties that are used by the pooling layer. 161s response.retries = retries 161s response._connection = response_conn # type: ignore[attr-defined] 161s response._pool = self # type: ignore[attr-defined] 161s 161s log.debug( 161s '%s://%s:%s "%s %s %s" %s %s', 161s self.scheme, 161s self.host, 161s self.port, 161s method, 161s url, 161s > response.version_string, 161s response.status, 161s response.length_remaining, 161s ) 161s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 161s 161s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 161s ----------------------------- Captured stderr call ----------------------------- 161s 127.0.0.1 - - [18/Jan/2025 02:32:11] "GET / HTTP/1.1" 200 9358 161s ______________________________ test_headers[http] ______________________________ 161s 161s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-1/test_headers_http_0') 161s httpbin_both = 161s verify_pool_mgr = 161s 161s def test_headers(tmpdir, httpbin_both, verify_pool_mgr): 161s """Ensure that we can read the headers back""" 161s url = httpbin_both.url 161s with vcr.use_cassette(str(tmpdir.join("headers.yaml"))): 161s > headers = verify_pool_mgr.request("GET", url).headers 161s 161s tests/integration/test_urllib3.py:44: 161s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 161s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 161s return self.request_encode_url( 161s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 161s return self.urlopen(method, url, **extra_kw) 161s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 161s response = conn.urlopen(method, u.request_uri, **kw) 161s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 161s response = self._make_request( 161s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 161s 161s self = 161s conn = 161s method = 'GET', url = '/', body = None, headers = {} 161s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 161s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 161s chunked = False, response_conn = None, 
preload_content = True
161s decode_content = True, enforce_content_length = True
161s 
161s > response.version_string,
161s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
161s 
161s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError
161s ----------------------------- Captured stderr call -----------------------------
161s 127.0.0.1 - - [18/Jan/2025 02:32:11] "GET / HTTP/1.1" 200 9358
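Every failure in this run has the same shape: urllib3 2.3.0's connectionpool._make_request() finishes a request and then logs response.version_string alongside response.status and response.length_remaining (connectionpool.py:551 in the tracebacks), while the VCRHTTPResponse object that vcr.py hands back in place of the real response does not define version_string. The sketch below is illustrative only; RecordedResponse is a hypothetical stand-in, not vcr.py's actual class or its eventual fix, and simply shows the kind of attribute a recorded-response stub would need to expose to satisfy that logging call.

    # Illustrative sketch only -- RecordedResponse is hypothetical and is NOT
    # vcr.py's VCRHTTPResponse; it just models the attributes urllib3 2.3.0
    # reads when logging a finished request.
    class RecordedResponse:
        def __init__(self, status=200, version=11, length_remaining=None):
            self.status = status
            self.version = version              # http.client style: 11 == HTTP/1.1
            self.length_remaining = length_remaining

        @property
        def version_string(self):
            # Derive the "HTTP/x.y" form that urllib3's debug log line
            # interpolates at connectionpool.py:551.
            if self.version == 9:
                return "HTTP/0.9"
            major, minor = divmod(self.version, 10)
            return f"HTTP/{major}.{minor}"

    if __name__ == "__main__":
        resp = RecordedResponse()
        # Without the property above, this attribute access is exactly what
        # raises the AttributeError repeated throughout this log.
        print(resp.version_string, resp.status, resp.length_remaining)

With an attribute like that in place, the log.debug() call shown in each traceback would succeed and the tests would proceed past this point.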
161s _______________________________ test_body[http] ________________________________
161s 
161s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-1/test_body_http_0')
161s httpbin_both = 
161s verify_pool_mgr = 
161s 
161s def test_body(tmpdir, httpbin_both, verify_pool_mgr):
161s """Ensure the responses are all identical enough"""
161s url = httpbin_both.url + "/bytes/1024"
161s with vcr.use_cassette(str(tmpdir.join("body.yaml"))):
161s > content = verify_pool_mgr.request("GET", url).data
161s 
161s tests/integration/test_urllib3.py:55: 
161s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
161s 
161s self = 
161s conn = 
161s method = 'GET', url = '/bytes/1024', body = None, headers = {}
161s 
161s > response.version_string,
161s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
161s 
161s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError
161s ----------------------------- Captured stderr call -----------------------------
161s 127.0.0.1 - - [18/Jan/2025 02:32:11] "GET /bytes/1024 HTTP/1.1" 200 1024
161s _______________________________ test_auth[http] ________________________________
161s 
161s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-1/test_auth_http_0')
161s httpbin_both = 
161s verify_pool_mgr = 
161s 
161s def test_auth(tmpdir, httpbin_both, verify_pool_mgr):
161s """Ensure that we can handle basic auth"""
161s auth = ("user", "passwd")
161s headers = urllib3.util.make_headers(basic_auth="{}:{}".format(*auth))
161s url = httpbin_both.url + "/basic-auth/user/passwd"
161s with vcr.use_cassette(str(tmpdir.join("auth.yaml"))):
161s > one = verify_pool_mgr.request("GET", url, headers=headers)
161s 
161s tests/integration/test_urllib3.py:67: 
161s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
161s 
161s self = 
161s conn = 
161s method = 'GET', url = '/basic-auth/user/passwd', body = None
161s headers = {'authorization': 'Basic dXNlcjpwYXNzd2Q='}
161s 
161s > response.version_string,
161s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
161s 
161s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError
161s ----------------------------- Captured stderr call -----------------------------
161s 127.0.0.1 - - [18/Jan/2025 02:32:11] "GET /basic-auth/user/passwd HTTP/1.1" 200 46
161s ____________________________ test_auth_failed[http] ____________________________
161s 
161s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-1/test_auth_failed_http_0')
161s httpbin_both = 
161s verify_pool_mgr = 
161s 
161s def test_auth_failed(tmpdir, httpbin_both, verify_pool_mgr):
161s """Ensure that we can save failed auth statuses"""
161s auth = ("user", "wrongwrongwrong")
161s headers = urllib3.util.make_headers(basic_auth="{}:{}".format(*auth))
161s url = httpbin_both.url + "/basic-auth/user/passwd"
161s with vcr.use_cassette(str(tmpdir.join("auth-failed.yaml"))) as cass:
161s # Ensure that this is empty to begin with
161s assert_cassette_empty(cass)
161s > one = verify_pool_mgr.request("GET", url, headers=headers)
161s 
161s tests/integration/test_urllib3.py:83: 
161s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
161s 
161s self = 
161s conn = 
161s method = 'GET', url = '/basic-auth/user/passwd', body = None
161s headers = {'authorization': 'Basic dXNlcjp3cm9uZ3dyb25nd3Jvbmc='}
161s 
161s > response.version_string,
161s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
161s 
161s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError
161s ----------------------------- Captured stderr call -----------------------------
161s 127.0.0.1 - - [18/Jan/2025 02:32:11] "GET /basic-auth/user/passwd HTTP/1.1" 401 0
161s _______________________________ test_post[http] ________________________________
161s 
161s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-1/test_post_http_0')
161s httpbin_both = 
161s verify_pool_mgr = 
161s 
161s def test_post(tmpdir, httpbin_both, verify_pool_mgr):
161s """Ensure that we can post and cache the results"""
161s data = {"key1": "value1", "key2": "value2"}
161s url = httpbin_both.url + "/post"
161s with vcr.use_cassette(str(tmpdir.join("verify_pool_mgr.yaml"))):
161s > req1 = verify_pool_mgr.request("POST", url, data).data
161s 
161s tests/integration/test_urllib3.py:94: 
161s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
161s 
161s self = 
161s conn = 
161s method = 'POST', url = '/post', body = {'key1': 'value1', 'key2': 'value2'}
161s headers = HTTPHeaderDict({})
161s 
161s > response.version_string,
161s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
161s 
161s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError
161s ----------------------------- Captured stderr call -----------------------------
161s 127.0.0.1 - - [18/Jan/2025 02:32:11] "POST /post HTTP/1.1" 501 159
161s _______________________________ test_gzip[http] ________________________________
161s 
161s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-1/test_gzip_http_0')
161s httpbin_both = 
161s verify_pool_mgr = 
161s 
161s def test_gzip(tmpdir, httpbin_both, verify_pool_mgr):
161s """
161s Ensure that requests (actually urllib3) is able to automatically decompress
161s the response body
161s """
161s url = httpbin_both.url + "/gzip"
161s response = verify_pool_mgr.request("GET", url)
161s 
161s with vcr.use_cassette(str(tmpdir.join("gzip.yaml"))):
161s > response = verify_pool_mgr.request("GET", url)
161s 
161s tests/integration/test_urllib3.py:140: 
161s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
161s 
161s self = 
161s conn = 
161s method = 'GET', url = '/gzip', body = None, headers = {}
161s 
161s > response.version_string,
161s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
161s 
161s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError
161s ----------------------------- Captured stderr call -----------------------------
161s 127.0.0.1 - - [18/Jan/2025 02:32:11] "GET /gzip HTTP/1.1" 200 165
161s 127.0.0.1 - - [18/Jan/2025 02:32:11] "GET /gzip HTTP/1.1" 200 165
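The [https] parametrizations that follow fail identically to the [http] ones above, because every request made through the verify_pool_mgr fixture goes through the same connection-pool logging path regardless of scheme. As one possible stop-gap while a vcr.py that provides version_string is packaged, a test suite could skip these integration tests when the installed urllib3 is new enough to require the attribute. The snippet is a sketch under that assumption only (a hypothetical conftest.py next to tests/integration); nothing in this log indicates the maintainers actually do this.

    # Hypothetical conftest.py sketch (illustrative only).
    import pytest
    import urllib3

    def _urllib3_at_least(major, minor):
        # "2.3.0" -> (2, 3); suffixes beyond the first two components are ignored.
        parts = urllib3.__version__.split(".")
        return (int(parts[0]), int(parts[1])) >= (major, minor)

    # The urllib3 under test here (2.3.0-1, per the test trigger) logs
    # response.version_string, which the recorded responses do not yet provide.
    requires_version_string = pytest.mark.skipif(
        _urllib3_at_least(2, 3),
        reason="recorded responses lack version_string expected by urllib3 >= 2.3",
    )

Individual tests could then carry @requires_version_string, or the whole module could be deselected with pytest's existing -k/--ignore options.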
161s ___________________________ test_status_code[https] ____________________________
161s 
161s httpbin_both = 
161s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-1/test_status_code_https_0')
161s verify_pool_mgr = 
161s 
161s def test_status_code(httpbin_both, tmpdir, verify_pool_mgr):
161s """Ensure that we can read the status code"""
161s url = httpbin_both.url
161s with vcr.use_cassette(str(tmpdir.join("atts.yaml"))):
161s > status_code = verify_pool_mgr.request("GET", url).status
161s 
161s tests/integration/test_urllib3.py:34: 
161s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
161s 
161s self = 
161s conn = 
161s method = 'GET', url = '/', body = None, headers = {}
161s 
161s > response.version_string,
161s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
161s 
161s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError
161s ----------------------------- Captured stderr call -----------------------------
161s 127.0.0.1 - - [18/Jan/2025 02:32:11] "GET / HTTP/1.1" 200 9358
161s _____________________________ test_headers[https] ______________________________
161s 
161s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-1/test_headers_https_0')
161s httpbin_both = 
161s verify_pool_mgr = 
161s 
161s def test_headers(tmpdir, httpbin_both, verify_pool_mgr):
161s """Ensure that we can read the headers back"""
161s url = httpbin_both.url
161s with vcr.use_cassette(str(tmpdir.join("headers.yaml"))):
161s > headers = verify_pool_mgr.request("GET", url).headers
161s 
161s tests/integration/test_urllib3.py:44: 
161s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
161s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request
161s return self.request_encode_url(
161s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url
161s return self.urlopen(method, url, **extra_kw)
161s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen
161s response = conn.urlopen(method, u.request_uri, **kw)
161s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen
161s response = self._make_request(
161s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
161s 
161s self = 
161s conn = 
161s method = 'GET', url = '/', body = None, headers = {}
161s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None)
161s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None)
161s chunked = False, response_conn = None, 
preload_content = True 161s decode_content = True, enforce_content_length = True 161s 161s def _make_request( 161s self, 161s conn: BaseHTTPConnection, 161s method: str, 161s url: str, 161s body: _TYPE_BODY | None = None, 161s headers: typing.Mapping[str, str] | None = None, 161s retries: Retry | None = None, 161s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 161s chunked: bool = False, 161s response_conn: BaseHTTPConnection | None = None, 161s preload_content: bool = True, 161s decode_content: bool = True, 161s enforce_content_length: bool = True, 161s ) -> BaseHTTPResponse: 161s """ 161s Perform a request on a given urllib connection object taken from our 161s pool. 161s 161s :param conn: 161s a connection from one of our connection pools 161s 161s :param method: 161s HTTP request method (such as GET, POST, PUT, etc.) 161s 161s :param url: 161s The URL to perform the request on. 161s 161s :param body: 161s Data to send in the request body, either :class:`str`, :class:`bytes`, 161s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 161s 161s :param headers: 161s Dictionary of custom headers to send, such as User-Agent, 161s If-None-Match, etc. If None, pool headers are used. If provided, 161s these headers completely replace any pool-specific headers. 161s 161s :param retries: 161s Configure the number of retries to allow before raising a 161s :class:`~urllib3.exceptions.MaxRetryError` exception. 161s 161s Pass ``None`` to retry until you receive a response. Pass a 161s :class:`~urllib3.util.retry.Retry` object for fine-grained control 161s over different types of retries. 161s Pass an integer number to retry connection errors that many times, 161s but no other types of errors. Pass zero to never retry. 161s 161s If ``False``, then retries are disabled and any exception is raised 161s immediately. Also, instead of raising a MaxRetryError on redirects, 161s the redirect response will be returned. 161s 161s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 161s 161s :param timeout: 161s If specified, overrides the default timeout for this one 161s request. It may be a float (in seconds) or an instance of 161s :class:`urllib3.util.Timeout`. 161s 161s :param chunked: 161s If True, urllib3 will send the body using chunked transfer 161s encoding. Otherwise, urllib3 will send the body using the standard 161s content-length form. Defaults to False. 161s 161s :param response_conn: 161s Set this to ``None`` if you will handle releasing the connection or 161s set the connection to have the response release it. 161s 161s :param preload_content: 161s If True, the response's body will be preloaded during construction. 161s 161s :param decode_content: 161s If True, will attempt to decode the body based on the 161s 'content-encoding' header. 161s 161s :param enforce_content_length: 161s Enforce content length checking. Body returned by server must match 161s value of Content-Length header, if present. Otherwise, raise error. 161s """ 161s self.num_requests += 1 161s 161s timeout_obj = self._get_timeout(timeout) 161s timeout_obj.start_connect() 161s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 161s 161s try: 161s # Trigger any extra validation we need to do. 
161s try: 161s self._validate_conn(conn) 161s except (SocketTimeout, BaseSSLError) as e: 161s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 161s raise 161s 161s # _validate_conn() starts the connection to an HTTPS proxy 161s # so we need to wrap errors with 'ProxyError' here too. 161s except ( 161s OSError, 161s NewConnectionError, 161s TimeoutError, 161s BaseSSLError, 161s CertificateError, 161s SSLError, 161s ) as e: 161s new_e: Exception = e 161s if isinstance(e, (BaseSSLError, CertificateError)): 161s new_e = SSLError(e) 161s # If the connection didn't successfully connect to it's proxy 161s # then there 161s if isinstance( 161s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 161s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 161s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 161s raise new_e 161s 161s # conn.request() calls http.client.*.request, not the method in 161s # urllib3.request. It also calls makefile (recv) on the socket. 161s try: 161s conn.request( 161s method, 161s url, 161s body=body, 161s headers=headers, 161s chunked=chunked, 161s preload_content=preload_content, 161s decode_content=decode_content, 161s enforce_content_length=enforce_content_length, 161s ) 161s 161s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 161s # legitimately able to close the connection after sending a valid response. 161s # With this behaviour, the received response is still readable. 161s except BrokenPipeError: 161s pass 161s except OSError as e: 161s # MacOS/Linux 161s # EPROTOTYPE and ECONNRESET are needed on macOS 161s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 161s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 161s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 161s raise 161s 161s # Reset the timeout for the recv() on the socket 161s read_timeout = timeout_obj.read_timeout 161s 161s if not conn.is_closed: 161s # In Python 3 socket.py will catch EAGAIN and return None when you 161s # try and read into the file pointer created by http.client, which 161s # instead raises a BadStatusLine exception. Instead of catching 161s # the exception and assuming all BadStatusLine exceptions are read 161s # timeouts, check for a zero timeout before making the request. 161s if read_timeout == 0: 161s raise ReadTimeoutError( 161s self, url, f"Read timed out. (read timeout={read_timeout})" 161s ) 161s conn.timeout = read_timeout 161s 161s # Receive the response from the server 161s try: 161s response = conn.getresponse() 161s except (BaseSSLError, OSError) as e: 161s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 161s raise 161s 161s # Set properties that are used by the pooling layer. 
161s response.retries = retries 161s response._connection = response_conn # type: ignore[attr-defined] 161s response._pool = self # type: ignore[attr-defined] 161s 161s log.debug( 161s '%s://%s:%s "%s %s %s" %s %s', 161s self.scheme, 161s self.host, 161s self.port, 161s method, 161s url, 161s > response.version_string, 161s response.status, 161s response.length_remaining, 161s ) 161s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 161s 161s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 161s ----------------------------- Captured stderr call ----------------------------- 161s 127.0.0.1 - - [18/Jan/2025 02:32:11] "GET / HTTP/1.1" 200 9358 161s _______________________________ test_body[https] _______________________________ 161s 161s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-1/test_body_https_0') 161s httpbin_both = 161s verify_pool_mgr = 161s 161s def test_body(tmpdir, httpbin_both, verify_pool_mgr): 161s """Ensure the responses are all identical enough""" 161s url = httpbin_both.url + "/bytes/1024" 161s with vcr.use_cassette(str(tmpdir.join("body.yaml"))): 161s > content = verify_pool_mgr.request("GET", url).data 161s 161s tests/integration/test_urllib3.py:55: 161s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 161s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 161s return self.request_encode_url( 161s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 161s return self.urlopen(method, url, **extra_kw) 161s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 161s response = conn.urlopen(method, u.request_uri, **kw) 161s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 161s response = self._make_request( 161s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 161s 161s self = 161s conn = 161s method = 'GET', url = '/bytes/1024', body = None, headers = {} 161s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 161s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 161s chunked = False, response_conn = None, preload_content = True 161s decode_content = True, enforce_content_length = True 161s 161s def _make_request( 161s self, 161s conn: BaseHTTPConnection, 161s method: str, 161s url: str, 161s body: _TYPE_BODY | None = None, 161s headers: typing.Mapping[str, str] | None = None, 161s retries: Retry | None = None, 161s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 161s chunked: bool = False, 161s response_conn: BaseHTTPConnection | None = None, 161s preload_content: bool = True, 161s decode_content: bool = True, 161s enforce_content_length: bool = True, 161s ) -> BaseHTTPResponse: 161s """ 161s Perform a request on a given urllib connection object taken from our 161s pool. 161s 161s :param conn: 161s a connection from one of our connection pools 161s 161s :param method: 161s HTTP request method (such as GET, POST, PUT, etc.) 161s 161s :param url: 161s The URL to perform the request on. 161s 161s :param body: 161s Data to send in the request body, either :class:`str`, :class:`bytes`, 161s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 161s 161s :param headers: 161s Dictionary of custom headers to send, such as User-Agent, 161s If-None-Match, etc. If None, pool headers are used. If provided, 161s these headers completely replace any pool-specific headers. 
161s 161s :param retries: 161s Configure the number of retries to allow before raising a 161s :class:`~urllib3.exceptions.MaxRetryError` exception. 161s 161s Pass ``None`` to retry until you receive a response. Pass a 161s :class:`~urllib3.util.retry.Retry` object for fine-grained control 161s over different types of retries. 161s Pass an integer number to retry connection errors that many times, 161s but no other types of errors. Pass zero to never retry. 161s 161s If ``False``, then retries are disabled and any exception is raised 161s immediately. Also, instead of raising a MaxRetryError on redirects, 161s the redirect response will be returned. 161s 161s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 161s 161s :param timeout: 161s If specified, overrides the default timeout for this one 161s request. It may be a float (in seconds) or an instance of 161s :class:`urllib3.util.Timeout`. 161s 161s :param chunked: 161s If True, urllib3 will send the body using chunked transfer 161s encoding. Otherwise, urllib3 will send the body using the standard 161s content-length form. Defaults to False. 161s 161s :param response_conn: 161s Set this to ``None`` if you will handle releasing the connection or 161s set the connection to have the response release it. 161s 161s :param preload_content: 161s If True, the response's body will be preloaded during construction. 161s 161s :param decode_content: 161s If True, will attempt to decode the body based on the 161s 'content-encoding' header. 161s 161s :param enforce_content_length: 161s Enforce content length checking. Body returned by server must match 161s value of Content-Length header, if present. Otherwise, raise error. 161s """ 161s self.num_requests += 1 161s 161s timeout_obj = self._get_timeout(timeout) 161s timeout_obj.start_connect() 161s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 161s 161s try: 161s # Trigger any extra validation we need to do. 161s try: 161s self._validate_conn(conn) 161s except (SocketTimeout, BaseSSLError) as e: 161s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 161s raise 161s 161s # _validate_conn() starts the connection to an HTTPS proxy 161s # so we need to wrap errors with 'ProxyError' here too. 161s except ( 161s OSError, 161s NewConnectionError, 161s TimeoutError, 161s BaseSSLError, 161s CertificateError, 161s SSLError, 161s ) as e: 161s new_e: Exception = e 161s if isinstance(e, (BaseSSLError, CertificateError)): 161s new_e = SSLError(e) 161s # If the connection didn't successfully connect to it's proxy 161s # then there 161s if isinstance( 161s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 161s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 161s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 161s raise new_e 161s 161s # conn.request() calls http.client.*.request, not the method in 161s # urllib3.request. It also calls makefile (recv) on the socket. 161s try: 161s conn.request( 161s method, 161s url, 161s body=body, 161s headers=headers, 161s chunked=chunked, 161s preload_content=preload_content, 161s decode_content=decode_content, 161s enforce_content_length=enforce_content_length, 161s ) 161s 161s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 161s # legitimately able to close the connection after sending a valid response. 161s # With this behaviour, the received response is still readable. 
161s except BrokenPipeError: 161s pass 161s except OSError as e: 161s # MacOS/Linux 161s # EPROTOTYPE and ECONNRESET are needed on macOS 161s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 161s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 161s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 161s raise 161s 161s # Reset the timeout for the recv() on the socket 161s read_timeout = timeout_obj.read_timeout 161s 161s if not conn.is_closed: 161s # In Python 3 socket.py will catch EAGAIN and return None when you 161s # try and read into the file pointer created by http.client, which 161s # instead raises a BadStatusLine exception. Instead of catching 161s # the exception and assuming all BadStatusLine exceptions are read 161s # timeouts, check for a zero timeout before making the request. 161s if read_timeout == 0: 161s raise ReadTimeoutError( 161s self, url, f"Read timed out. (read timeout={read_timeout})" 161s ) 161s conn.timeout = read_timeout 161s 161s # Receive the response from the server 161s try: 161s response = conn.getresponse() 161s except (BaseSSLError, OSError) as e: 161s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 161s raise 161s 161s # Set properties that are used by the pooling layer. 161s response.retries = retries 161s response._connection = response_conn # type: ignore[attr-defined] 161s response._pool = self # type: ignore[attr-defined] 161s 161s log.debug( 161s '%s://%s:%s "%s %s %s" %s %s', 161s self.scheme, 161s self.host, 161s self.port, 161s method, 161s url, 161s > response.version_string, 161s response.status, 161s response.length_remaining, 161s ) 161s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 161s 161s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 161s ----------------------------- Captured stderr call ----------------------------- 161s 127.0.0.1 - - [18/Jan/2025 02:32:11] "GET /bytes/1024 HTTP/1.1" 200 1024 161s _______________________________ test_auth[https] _______________________________ 161s 161s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-1/test_auth_https_0') 161s httpbin_both = 161s verify_pool_mgr = 161s 161s def test_auth(tmpdir, httpbin_both, verify_pool_mgr): 161s """Ensure that we can handle basic auth""" 161s auth = ("user", "passwd") 161s headers = urllib3.util.make_headers(basic_auth="{}:{}".format(*auth)) 161s url = httpbin_both.url + "/basic-auth/user/passwd" 161s with vcr.use_cassette(str(tmpdir.join("auth.yaml"))): 161s > one = verify_pool_mgr.request("GET", url, headers=headers) 161s 161s tests/integration/test_urllib3.py:67: 161s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 161s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 161s return self.request_encode_url( 161s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 161s return self.urlopen(method, url, **extra_kw) 161s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 161s response = conn.urlopen(method, u.request_uri, **kw) 161s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 161s response = self._make_request( 161s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 161s 161s self = 161s conn = 161s method = 'GET', url = '/basic-auth/user/passwd', body = None 161s headers = {'authorization': 'Basic dXNlcjpwYXNzd2Q='} 161s retries = Retry(total=3, 
connect=None, read=None, redirect=None, status=None) 161s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 161s chunked = False, response_conn = None, preload_content = True 161s decode_content = True, enforce_content_length = True 161s 161s def _make_request( 161s self, 161s conn: BaseHTTPConnection, 161s method: str, 161s url: str, 161s body: _TYPE_BODY | None = None, 161s headers: typing.Mapping[str, str] | None = None, 161s retries: Retry | None = None, 161s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 161s chunked: bool = False, 161s response_conn: BaseHTTPConnection | None = None, 161s preload_content: bool = True, 161s decode_content: bool = True, 161s enforce_content_length: bool = True, 161s ) -> BaseHTTPResponse: 161s """ 161s Perform a request on a given urllib connection object taken from our 161s pool. 161s 161s :param conn: 161s a connection from one of our connection pools 161s 161s :param method: 161s HTTP request method (such as GET, POST, PUT, etc.) 161s 161s :param url: 161s The URL to perform the request on. 161s 161s :param body: 161s Data to send in the request body, either :class:`str`, :class:`bytes`, 161s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 161s 161s :param headers: 161s Dictionary of custom headers to send, such as User-Agent, 161s If-None-Match, etc. If None, pool headers are used. If provided, 161s these headers completely replace any pool-specific headers. 161s 161s :param retries: 161s Configure the number of retries to allow before raising a 161s :class:`~urllib3.exceptions.MaxRetryError` exception. 161s 161s Pass ``None`` to retry until you receive a response. Pass a 161s :class:`~urllib3.util.retry.Retry` object for fine-grained control 161s over different types of retries. 161s Pass an integer number to retry connection errors that many times, 161s but no other types of errors. Pass zero to never retry. 161s 161s If ``False``, then retries are disabled and any exception is raised 161s immediately. Also, instead of raising a MaxRetryError on redirects, 161s the redirect response will be returned. 161s 161s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 161s 161s :param timeout: 161s If specified, overrides the default timeout for this one 161s request. It may be a float (in seconds) or an instance of 161s :class:`urllib3.util.Timeout`. 161s 161s :param chunked: 161s If True, urllib3 will send the body using chunked transfer 161s encoding. Otherwise, urllib3 will send the body using the standard 161s content-length form. Defaults to False. 161s 161s :param response_conn: 161s Set this to ``None`` if you will handle releasing the connection or 161s set the connection to have the response release it. 161s 161s :param preload_content: 161s If True, the response's body will be preloaded during construction. 161s 161s :param decode_content: 161s If True, will attempt to decode the body based on the 161s 'content-encoding' header. 161s 161s :param enforce_content_length: 161s Enforce content length checking. Body returned by server must match 161s value of Content-Length header, if present. Otherwise, raise error. 161s """ 161s self.num_requests += 1 161s 161s timeout_obj = self._get_timeout(timeout) 161s timeout_obj.start_connect() 161s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 161s 161s try: 161s # Trigger any extra validation we need to do. 
161s try: 161s self._validate_conn(conn) 161s except (SocketTimeout, BaseSSLError) as e: 161s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 161s raise 161s 161s # _validate_conn() starts the connection to an HTTPS proxy 161s # so we need to wrap errors with 'ProxyError' here too. 161s except ( 161s OSError, 161s NewConnectionError, 161s TimeoutError, 161s BaseSSLError, 161s CertificateError, 161s SSLError, 161s ) as e: 161s new_e: Exception = e 161s if isinstance(e, (BaseSSLError, CertificateError)): 161s new_e = SSLError(e) 161s # If the connection didn't successfully connect to it's proxy 161s # then there 161s if isinstance( 161s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 161s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 161s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 161s raise new_e 161s 161s # conn.request() calls http.client.*.request, not the method in 161s # urllib3.request. It also calls makefile (recv) on the socket. 161s try: 161s conn.request( 161s method, 161s url, 161s body=body, 161s headers=headers, 161s chunked=chunked, 161s preload_content=preload_content, 161s decode_content=decode_content, 161s enforce_content_length=enforce_content_length, 161s ) 161s 161s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 161s # legitimately able to close the connection after sending a valid response. 161s # With this behaviour, the received response is still readable. 161s except BrokenPipeError: 161s pass 161s except OSError as e: 161s # MacOS/Linux 161s # EPROTOTYPE and ECONNRESET are needed on macOS 161s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 161s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 161s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 161s raise 161s 161s # Reset the timeout for the recv() on the socket 161s read_timeout = timeout_obj.read_timeout 161s 161s if not conn.is_closed: 161s # In Python 3 socket.py will catch EAGAIN and return None when you 161s # try and read into the file pointer created by http.client, which 161s # instead raises a BadStatusLine exception. Instead of catching 161s # the exception and assuming all BadStatusLine exceptions are read 161s # timeouts, check for a zero timeout before making the request. 161s if read_timeout == 0: 161s raise ReadTimeoutError( 161s self, url, f"Read timed out. (read timeout={read_timeout})" 161s ) 161s conn.timeout = read_timeout 161s 161s # Receive the response from the server 161s try: 161s response = conn.getresponse() 161s except (BaseSSLError, OSError) as e: 161s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 161s raise 161s 161s # Set properties that are used by the pooling layer. 
161s response.retries = retries 161s response._connection = response_conn # type: ignore[attr-defined] 161s response._pool = self # type: ignore[attr-defined] 161s 161s log.debug( 161s '%s://%s:%s "%s %s %s" %s %s', 161s self.scheme, 161s self.host, 161s self.port, 161s method, 161s url, 161s > response.version_string, 161s response.status, 161s response.length_remaining, 161s ) 161s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 161s 161s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 161s ----------------------------- Captured stderr call ----------------------------- 161s 127.0.0.1 - - [18/Jan/2025 02:32:11] "GET /basic-auth/user/passwd HTTP/1.1" 200 46 161s ___________________________ test_auth_failed[https] ____________________________ 161s 161s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-1/test_auth_failed_https_0') 161s httpbin_both = 161s verify_pool_mgr = 161s 161s def test_auth_failed(tmpdir, httpbin_both, verify_pool_mgr): 161s """Ensure that we can save failed auth statuses""" 161s auth = ("user", "wrongwrongwrong") 161s headers = urllib3.util.make_headers(basic_auth="{}:{}".format(*auth)) 161s url = httpbin_both.url + "/basic-auth/user/passwd" 161s with vcr.use_cassette(str(tmpdir.join("auth-failed.yaml"))) as cass: 161s # Ensure that this is empty to begin with 161s assert_cassette_empty(cass) 161s > one = verify_pool_mgr.request("GET", url, headers=headers) 161s 161s tests/integration/test_urllib3.py:83: 161s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 161s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 161s return self.request_encode_url( 161s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 161s return self.urlopen(method, url, **extra_kw) 161s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 161s response = conn.urlopen(method, u.request_uri, **kw) 161s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 161s response = self._make_request( 161s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 161s 161s self = 161s conn = 161s method = 'GET', url = '/basic-auth/user/passwd', body = None 161s headers = {'authorization': 'Basic dXNlcjp3cm9uZ3dyb25nd3Jvbmc='} 161s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 161s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 161s chunked = False, response_conn = None, preload_content = True 161s decode_content = True, enforce_content_length = True 161s 161s def _make_request( 161s self, 161s conn: BaseHTTPConnection, 161s method: str, 161s url: str, 161s body: _TYPE_BODY | None = None, 161s headers: typing.Mapping[str, str] | None = None, 161s retries: Retry | None = None, 161s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 161s chunked: bool = False, 161s response_conn: BaseHTTPConnection | None = None, 161s preload_content: bool = True, 161s decode_content: bool = True, 161s enforce_content_length: bool = True, 161s ) -> BaseHTTPResponse: 161s """ 161s Perform a request on a given urllib connection object taken from our 161s pool. 161s 161s :param conn: 161s a connection from one of our connection pools 161s 161s :param method: 161s HTTP request method (such as GET, POST, PUT, etc.) 161s 161s :param url: 161s The URL to perform the request on. 
161s 161s :param body: 161s Data to send in the request body, either :class:`str`, :class:`bytes`, 161s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 161s 161s :param headers: 161s Dictionary of custom headers to send, such as User-Agent, 161s If-None-Match, etc. If None, pool headers are used. If provided, 161s these headers completely replace any pool-specific headers. 161s 161s :param retries: 161s Configure the number of retries to allow before raising a 161s :class:`~urllib3.exceptions.MaxRetryError` exception. 161s 161s Pass ``None`` to retry until you receive a response. Pass a 161s :class:`~urllib3.util.retry.Retry` object for fine-grained control 161s over different types of retries. 161s Pass an integer number to retry connection errors that many times, 161s but no other types of errors. Pass zero to never retry. 161s 161s If ``False``, then retries are disabled and any exception is raised 161s immediately. Also, instead of raising a MaxRetryError on redirects, 161s the redirect response will be returned. 161s 161s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 161s 161s :param timeout: 161s If specified, overrides the default timeout for this one 161s request. It may be a float (in seconds) or an instance of 161s :class:`urllib3.util.Timeout`. 161s 161s :param chunked: 161s If True, urllib3 will send the body using chunked transfer 161s encoding. Otherwise, urllib3 will send the body using the standard 161s content-length form. Defaults to False. 161s 161s :param response_conn: 161s Set this to ``None`` if you will handle releasing the connection or 161s set the connection to have the response release it. 161s 161s :param preload_content: 161s If True, the response's body will be preloaded during construction. 161s 161s :param decode_content: 161s If True, will attempt to decode the body based on the 161s 'content-encoding' header. 161s 161s :param enforce_content_length: 161s Enforce content length checking. Body returned by server must match 161s value of Content-Length header, if present. Otherwise, raise error. 161s """ 161s self.num_requests += 1 161s 161s timeout_obj = self._get_timeout(timeout) 161s timeout_obj.start_connect() 161s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 161s 161s try: 161s # Trigger any extra validation we need to do. 161s try: 161s self._validate_conn(conn) 161s except (SocketTimeout, BaseSSLError) as e: 161s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 161s raise 161s 161s # _validate_conn() starts the connection to an HTTPS proxy 161s # so we need to wrap errors with 'ProxyError' here too. 161s except ( 161s OSError, 161s NewConnectionError, 161s TimeoutError, 161s BaseSSLError, 161s CertificateError, 161s SSLError, 161s ) as e: 161s new_e: Exception = e 161s if isinstance(e, (BaseSSLError, CertificateError)): 161s new_e = SSLError(e) 161s # If the connection didn't successfully connect to it's proxy 161s # then there 161s if isinstance( 161s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 161s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 161s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 161s raise new_e 161s 161s # conn.request() calls http.client.*.request, not the method in 161s # urllib3.request. It also calls makefile (recv) on the socket. 
161s try: 161s conn.request( 161s method, 161s url, 161s body=body, 161s headers=headers, 161s chunked=chunked, 161s preload_content=preload_content, 161s decode_content=decode_content, 161s enforce_content_length=enforce_content_length, 161s ) 161s 161s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 161s # legitimately able to close the connection after sending a valid response. 161s # With this behaviour, the received response is still readable. 161s except BrokenPipeError: 161s pass 161s except OSError as e: 161s # MacOS/Linux 161s # EPROTOTYPE and ECONNRESET are needed on macOS 161s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 161s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 161s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 161s raise 161s 161s # Reset the timeout for the recv() on the socket 161s read_timeout = timeout_obj.read_timeout 161s 161s if not conn.is_closed: 161s # In Python 3 socket.py will catch EAGAIN and return None when you 161s # try and read into the file pointer created by http.client, which 161s # instead raises a BadStatusLine exception. Instead of catching 161s # the exception and assuming all BadStatusLine exceptions are read 161s # timeouts, check for a zero timeout before making the request. 161s if read_timeout == 0: 161s raise ReadTimeoutError( 161s self, url, f"Read timed out. (read timeout={read_timeout})" 161s ) 161s conn.timeout = read_timeout 161s 161s # Receive the response from the server 161s try: 161s response = conn.getresponse() 161s except (BaseSSLError, OSError) as e: 161s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 161s raise 161s 161s # Set properties that are used by the pooling layer. 
161s response.retries = retries 161s response._connection = response_conn # type: ignore[attr-defined] 161s response._pool = self # type: ignore[attr-defined] 161s 161s log.debug( 161s '%s://%s:%s "%s %s %s" %s %s', 161s self.scheme, 161s self.host, 161s self.port, 161s method, 161s url, 161s > response.version_string, 161s response.status, 161s response.length_remaining, 161s ) 161s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 161s 161s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 161s ----------------------------- Captured stderr call ----------------------------- 161s 127.0.0.1 - - [18/Jan/2025 02:32:11] "GET /basic-auth/user/passwd HTTP/1.1" 401 0 161s _______________________________ test_post[https] _______________________________ 161s 161s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-1/test_post_https_0') 161s httpbin_both = 161s verify_pool_mgr = 161s 161s def test_post(tmpdir, httpbin_both, verify_pool_mgr): 161s """Ensure that we can post and cache the results""" 161s data = {"key1": "value1", "key2": "value2"} 161s url = httpbin_both.url + "/post" 161s with vcr.use_cassette(str(tmpdir.join("verify_pool_mgr.yaml"))): 161s > req1 = verify_pool_mgr.request("POST", url, data).data 161s 161s tests/integration/test_urllib3.py:94: 161s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 161s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:143: in request 161s return self.request_encode_body( 161s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:278: in request_encode_body 161s return self.urlopen(method, url, **extra_kw) 161s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 161s response = conn.urlopen(method, u.request_uri, **kw) 161s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 161s response = self._make_request( 161s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 161s 161s self = 161s conn = 161s method = 'POST', url = '/post', body = {'key1': 'value1', 'key2': 'value2'} 161s headers = HTTPHeaderDict({}) 161s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 161s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 161s chunked = False, response_conn = None, preload_content = True 161s decode_content = True, enforce_content_length = True 161s 161s def _make_request( 161s self, 161s conn: BaseHTTPConnection, 161s method: str, 161s url: str, 161s body: _TYPE_BODY | None = None, 161s headers: typing.Mapping[str, str] | None = None, 161s retries: Retry | None = None, 161s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 161s chunked: bool = False, 161s response_conn: BaseHTTPConnection | None = None, 161s preload_content: bool = True, 161s decode_content: bool = True, 161s enforce_content_length: bool = True, 161s ) -> BaseHTTPResponse: 161s """ 161s Perform a request on a given urllib connection object taken from our 161s pool. 161s 161s :param conn: 161s a connection from one of our connection pools 161s 161s :param method: 161s HTTP request method (such as GET, POST, PUT, etc.) 161s 161s :param url: 161s The URL to perform the request on. 161s 161s :param body: 161s Data to send in the request body, either :class:`str`, :class:`bytes`, 161s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 161s 161s :param headers: 161s Dictionary of custom headers to send, such as User-Agent, 161s If-None-Match, etc. 
If None, pool headers are used. If provided, 161s these headers completely replace any pool-specific headers. 161s 161s :param retries: 161s Configure the number of retries to allow before raising a 161s :class:`~urllib3.exceptions.MaxRetryError` exception. 161s 161s Pass ``None`` to retry until you receive a response. Pass a 161s :class:`~urllib3.util.retry.Retry` object for fine-grained control 161s over different types of retries. 161s Pass an integer number to retry connection errors that many times, 161s but no other types of errors. Pass zero to never retry. 161s 161s If ``False``, then retries are disabled and any exception is raised 161s immediately. Also, instead of raising a MaxRetryError on redirects, 161s the redirect response will be returned. 161s 161s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 161s 161s :param timeout: 161s If specified, overrides the default timeout for this one 161s request. It may be a float (in seconds) or an instance of 161s :class:`urllib3.util.Timeout`. 161s 161s :param chunked: 161s If True, urllib3 will send the body using chunked transfer 161s encoding. Otherwise, urllib3 will send the body using the standard 161s content-length form. Defaults to False. 161s 161s :param response_conn: 161s Set this to ``None`` if you will handle releasing the connection or 161s set the connection to have the response release it. 161s 161s :param preload_content: 161s If True, the response's body will be preloaded during construction. 161s 161s :param decode_content: 161s If True, will attempt to decode the body based on the 161s 'content-encoding' header. 161s 161s :param enforce_content_length: 161s Enforce content length checking. Body returned by server must match 161s value of Content-Length header, if present. Otherwise, raise error. 161s """ 161s self.num_requests += 1 161s 161s timeout_obj = self._get_timeout(timeout) 161s timeout_obj.start_connect() 161s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 161s 161s try: 161s # Trigger any extra validation we need to do. 161s try: 161s self._validate_conn(conn) 161s except (SocketTimeout, BaseSSLError) as e: 161s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 161s raise 161s 161s # _validate_conn() starts the connection to an HTTPS proxy 161s # so we need to wrap errors with 'ProxyError' here too. 161s except ( 161s OSError, 161s NewConnectionError, 161s TimeoutError, 161s BaseSSLError, 161s CertificateError, 161s SSLError, 161s ) as e: 161s new_e: Exception = e 161s if isinstance(e, (BaseSSLError, CertificateError)): 161s new_e = SSLError(e) 161s # If the connection didn't successfully connect to it's proxy 161s # then there 161s if isinstance( 161s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 161s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 161s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 161s raise new_e 161s 161s # conn.request() calls http.client.*.request, not the method in 161s # urllib3.request. It also calls makefile (recv) on the socket. 161s try: 161s conn.request( 161s method, 161s url, 161s body=body, 161s headers=headers, 161s chunked=chunked, 161s preload_content=preload_content, 161s decode_content=decode_content, 161s enforce_content_length=enforce_content_length, 161s ) 161s 161s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 161s # legitimately able to close the connection after sending a valid response. 
161s # With this behaviour, the received response is still readable. 161s except BrokenPipeError: 161s pass 161s except OSError as e: 161s # MacOS/Linux 161s # EPROTOTYPE and ECONNRESET are needed on macOS 161s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 161s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 161s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 161s raise 161s 161s # Reset the timeout for the recv() on the socket 161s read_timeout = timeout_obj.read_timeout 161s 161s if not conn.is_closed: 161s # In Python 3 socket.py will catch EAGAIN and return None when you 161s # try and read into the file pointer created by http.client, which 161s # instead raises a BadStatusLine exception. Instead of catching 161s # the exception and assuming all BadStatusLine exceptions are read 161s # timeouts, check for a zero timeout before making the request. 161s if read_timeout == 0: 161s raise ReadTimeoutError( 161s self, url, f"Read timed out. (read timeout={read_timeout})" 161s ) 161s conn.timeout = read_timeout 161s 161s # Receive the response from the server 161s try: 161s response = conn.getresponse() 161s except (BaseSSLError, OSError) as e: 161s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 161s raise 161s 161s # Set properties that are used by the pooling layer. 161s response.retries = retries 161s response._connection = response_conn # type: ignore[attr-defined] 161s response._pool = self # type: ignore[attr-defined] 161s 161s log.debug( 161s '%s://%s:%s "%s %s %s" %s %s', 161s self.scheme, 161s self.host, 161s self.port, 161s method, 161s url, 161s > response.version_string, 161s response.status, 161s response.length_remaining, 161s ) 161s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 161s 161s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 161s ----------------------------- Captured stderr call ----------------------------- 161s 127.0.0.1 - - [18/Jan/2025 02:32:11] "POST /post HTTP/1.1" 501 159 161s _______________________________ test_gzip[https] _______________________________ 161s 161s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-1/test_gzip_https_0') 161s httpbin_both = 161s verify_pool_mgr = 161s 161s def test_gzip(tmpdir, httpbin_both, verify_pool_mgr): 161s """ 161s Ensure that requests (actually urllib3) is able to automatically decompress 161s the response body 161s """ 161s url = httpbin_both.url + "/gzip" 161s response = verify_pool_mgr.request("GET", url) 161s 161s with vcr.use_cassette(str(tmpdir.join("gzip.yaml"))): 161s > response = verify_pool_mgr.request("GET", url) 161s 161s tests/integration/test_urllib3.py:140: 161s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 161s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 161s return self.request_encode_url( 161s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 161s return self.urlopen(method, url, **extra_kw) 161s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 161s response = conn.urlopen(method, u.request_uri, **kw) 161s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 161s response = self._make_request( 161s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 161s 161s self = 161s conn = 161s method = 'GET', url = '/gzip', body = None, headers = {} 161s retries = Retry(total=3, 
connect=None, read=None, redirect=None, status=None) 161s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 161s chunked = False, response_conn = None, preload_content = True 161s decode_content = True, enforce_content_length = True 161s 161s def _make_request( 161s self, 161s conn: BaseHTTPConnection, 161s method: str, 161s url: str, 161s body: _TYPE_BODY | None = None, 161s headers: typing.Mapping[str, str] | None = None, 161s retries: Retry | None = None, 161s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 161s chunked: bool = False, 161s response_conn: BaseHTTPConnection | None = None, 161s preload_content: bool = True, 161s decode_content: bool = True, 161s enforce_content_length: bool = True, 161s ) -> BaseHTTPResponse: 161s """ 161s Perform a request on a given urllib connection object taken from our 161s pool. 161s 161s :param conn: 161s a connection from one of our connection pools 161s 161s :param method: 161s HTTP request method (such as GET, POST, PUT, etc.) 161s 161s :param url: 161s The URL to perform the request on. 161s 161s :param body: 161s Data to send in the request body, either :class:`str`, :class:`bytes`, 161s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 161s 161s :param headers: 161s Dictionary of custom headers to send, such as User-Agent, 161s If-None-Match, etc. If None, pool headers are used. If provided, 161s these headers completely replace any pool-specific headers. 161s 161s :param retries: 161s Configure the number of retries to allow before raising a 161s :class:`~urllib3.exceptions.MaxRetryError` exception. 161s 161s Pass ``None`` to retry until you receive a response. Pass a 161s :class:`~urllib3.util.retry.Retry` object for fine-grained control 161s over different types of retries. 161s Pass an integer number to retry connection errors that many times, 161s but no other types of errors. Pass zero to never retry. 161s 161s If ``False``, then retries are disabled and any exception is raised 161s immediately. Also, instead of raising a MaxRetryError on redirects, 161s the redirect response will be returned. 161s 161s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 161s 161s :param timeout: 161s If specified, overrides the default timeout for this one 161s request. It may be a float (in seconds) or an instance of 161s :class:`urllib3.util.Timeout`. 161s 161s :param chunked: 161s If True, urllib3 will send the body using chunked transfer 161s encoding. Otherwise, urllib3 will send the body using the standard 161s content-length form. Defaults to False. 161s 161s :param response_conn: 161s Set this to ``None`` if you will handle releasing the connection or 161s set the connection to have the response release it. 161s 161s :param preload_content: 161s If True, the response's body will be preloaded during construction. 161s 161s :param decode_content: 161s If True, will attempt to decode the body based on the 161s 'content-encoding' header. 161s 161s :param enforce_content_length: 161s Enforce content length checking. Body returned by server must match 161s value of Content-Length header, if present. Otherwise, raise error. 161s """ 161s self.num_requests += 1 161s 161s timeout_obj = self._get_timeout(timeout) 161s timeout_obj.start_connect() 161s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 161s 161s try: 161s # Trigger any extra validation we need to do. 
161s try: 161s self._validate_conn(conn) 161s except (SocketTimeout, BaseSSLError) as e: 161s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 161s raise 161s 161s # _validate_conn() starts the connection to an HTTPS proxy 161s # so we need to wrap errors with 'ProxyError' here too. 161s except ( 161s OSError, 161s NewConnectionError, 161s TimeoutError, 161s BaseSSLError, 161s CertificateError, 161s SSLError, 161s ) as e: 161s new_e: Exception = e 161s if isinstance(e, (BaseSSLError, CertificateError)): 161s new_e = SSLError(e) 161s # If the connection didn't successfully connect to it's proxy 161s # then there 161s if isinstance( 161s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 161s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 161s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 161s raise new_e 161s 161s # conn.request() calls http.client.*.request, not the method in 161s # urllib3.request. It also calls makefile (recv) on the socket. 161s try: 161s conn.request( 161s method, 161s url, 161s body=body, 161s headers=headers, 161s chunked=chunked, 161s preload_content=preload_content, 161s decode_content=decode_content, 161s enforce_content_length=enforce_content_length, 161s ) 161s 161s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 161s # legitimately able to close the connection after sending a valid response. 161s # With this behaviour, the received response is still readable. 161s except BrokenPipeError: 161s pass 161s except OSError as e: 161s # MacOS/Linux 161s # EPROTOTYPE and ECONNRESET are needed on macOS 161s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 161s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 161s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 161s raise 161s 161s # Reset the timeout for the recv() on the socket 161s read_timeout = timeout_obj.read_timeout 161s 161s if not conn.is_closed: 161s # In Python 3 socket.py will catch EAGAIN and return None when you 161s # try and read into the file pointer created by http.client, which 161s # instead raises a BadStatusLine exception. Instead of catching 161s # the exception and assuming all BadStatusLine exceptions are read 161s # timeouts, check for a zero timeout before making the request. 161s if read_timeout == 0: 161s raise ReadTimeoutError( 161s self, url, f"Read timed out. (read timeout={read_timeout})" 161s ) 161s conn.timeout = read_timeout 161s 161s # Receive the response from the server 161s try: 161s response = conn.getresponse() 161s except (BaseSSLError, OSError) as e: 161s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 161s raise 161s 161s # Set properties that are used by the pooling layer. 
161s response.retries = retries 161s response._connection = response_conn # type: ignore[attr-defined] 161s response._pool = self # type: ignore[attr-defined] 161s 161s log.debug( 161s '%s://%s:%s "%s %s %s" %s %s', 161s self.scheme, 161s self.host, 161s self.port, 161s method, 161s url, 161s > response.version_string, 161s response.status, 161s response.length_remaining, 161s ) 161s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 161s 161s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 161s ----------------------------- Captured stderr call ----------------------------- 161s 127.0.0.1 - - [18/Jan/2025 02:32:12] "GET /gzip HTTP/1.1" 200 165 161s 127.0.0.1 - - [18/Jan/2025 02:32:12] "GET /gzip HTTP/1.1" 200 165 161s ________________________________ test_use_proxy ________________________________ 161s 161s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-1/test_use_proxy0') 161s httpbin = 161s proxy_server = 'http://0.0.0.0:59585' 161s 161s def test_use_proxy(tmpdir, httpbin, proxy_server): 161s """Ensure that it works with a proxy.""" 161s with vcr.use_cassette(str(tmpdir.join("proxy.yaml"))): 161s > response = requests.get(httpbin.url, proxies={"http": proxy_server}) 161s 161s tests/integration/test_proxy.py:53: 161s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 161s /usr/lib/python3/dist-packages/requests/api.py:73: in get 161s return request("get", url, params=params, **kwargs) 161s /usr/lib/python3/dist-packages/requests/api.py:59: in request 161s return session.request(method=method, url=url, **kwargs) 161s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 161s resp = self.send(prep, **send_kwargs) 161s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 161s r = adapter.send(request, **kwargs) 161s /usr/lib/python3/dist-packages/requests/adapters.py:667: in send 161s resp = conn.urlopen( 161s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 161s response = self._make_request( 161s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 161s 161s self = 161s conn = 161s method = 'GET', url = 'http://127.0.0.1:44083/', body = None 161s headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate, br', 'Accept': '*/*', 'Connection': 'keep-alive'} 161s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 161s timeout = Timeout(connect=None, read=None, total=None), chunked = False 161s response_conn = 161s preload_content = False, decode_content = False, enforce_content_length = True 161s 161s def _make_request( 161s self, 161s conn: BaseHTTPConnection, 161s method: str, 161s url: str, 161s body: _TYPE_BODY | None = None, 161s headers: typing.Mapping[str, str] | None = None, 161s retries: Retry | None = None, 161s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 161s chunked: bool = False, 161s response_conn: BaseHTTPConnection | None = None, 161s preload_content: bool = True, 161s decode_content: bool = True, 161s enforce_content_length: bool = True, 161s ) -> BaseHTTPResponse: 161s """ 161s Perform a request on a given urllib connection object taken from our 161s pool. 161s 161s :param conn: 161s a connection from one of our connection pools 161s 161s :param method: 161s HTTP request method (such as GET, POST, PUT, etc.) 161s 161s :param url: 161s The URL to perform the request on. 
161s 161s :param body: 161s Data to send in the request body, either :class:`str`, :class:`bytes`, 161s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 161s 161s :param headers: 161s Dictionary of custom headers to send, such as User-Agent, 161s If-None-Match, etc. If None, pool headers are used. If provided, 161s these headers completely replace any pool-specific headers. 161s 161s :param retries: 161s Configure the number of retries to allow before raising a 161s :class:`~urllib3.exceptions.MaxRetryError` exception. 161s 161s Pass ``None`` to retry until you receive a response. Pass a 161s :class:`~urllib3.util.retry.Retry` object for fine-grained control 161s over different types of retries. 161s Pass an integer number to retry connection errors that many times, 161s but no other types of errors. Pass zero to never retry. 161s 161s If ``False``, then retries are disabled and any exception is raised 161s immediately. Also, instead of raising a MaxRetryError on redirects, 161s the redirect response will be returned. 161s 161s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 161s 161s :param timeout: 161s If specified, overrides the default timeout for this one 161s request. It may be a float (in seconds) or an instance of 161s :class:`urllib3.util.Timeout`. 161s 161s :param chunked: 161s If True, urllib3 will send the body using chunked transfer 161s encoding. Otherwise, urllib3 will send the body using the standard 161s content-length form. Defaults to False. 161s 161s :param response_conn: 161s Set this to ``None`` if you will handle releasing the connection or 161s set the connection to have the response release it. 161s 161s :param preload_content: 161s If True, the response's body will be preloaded during construction. 161s 161s :param decode_content: 161s If True, will attempt to decode the body based on the 161s 'content-encoding' header. 161s 161s :param enforce_content_length: 161s Enforce content length checking. Body returned by server must match 161s value of Content-Length header, if present. Otherwise, raise error. 161s """ 161s self.num_requests += 1 161s 161s timeout_obj = self._get_timeout(timeout) 161s timeout_obj.start_connect() 161s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 161s 161s try: 161s # Trigger any extra validation we need to do. 161s try: 161s self._validate_conn(conn) 161s except (SocketTimeout, BaseSSLError) as e: 161s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 161s raise 161s 161s # _validate_conn() starts the connection to an HTTPS proxy 161s # so we need to wrap errors with 'ProxyError' here too. 161s except ( 161s OSError, 161s NewConnectionError, 161s TimeoutError, 161s BaseSSLError, 161s CertificateError, 161s SSLError, 161s ) as e: 161s new_e: Exception = e 161s if isinstance(e, (BaseSSLError, CertificateError)): 161s new_e = SSLError(e) 161s # If the connection didn't successfully connect to it's proxy 161s # then there 161s if isinstance( 161s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 161s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 161s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 161s raise new_e 161s 161s # conn.request() calls http.client.*.request, not the method in 161s # urllib3.request. It also calls makefile (recv) on the socket. 
161s try: 161s conn.request( 161s method, 161s url, 161s body=body, 161s headers=headers, 161s chunked=chunked, 161s preload_content=preload_content, 161s decode_content=decode_content, 161s enforce_content_length=enforce_content_length, 161s ) 161s 161s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 161s # legitimately able to close the connection after sending a valid response. 161s # With this behaviour, the received response is still readable. 161s except BrokenPipeError: 161s pass 161s except OSError as e: 161s # MacOS/Linux 161s # EPROTOTYPE and ECONNRESET are needed on macOS 161s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 161s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 161s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 161s raise 161s 161s # Reset the timeout for the recv() on the socket 161s read_timeout = timeout_obj.read_timeout 161s 161s if not conn.is_closed: 161s # In Python 3 socket.py will catch EAGAIN and return None when you 161s # try and read into the file pointer created by http.client, which 161s # instead raises a BadStatusLine exception. Instead of catching 161s # the exception and assuming all BadStatusLine exceptions are read 161s # timeouts, check for a zero timeout before making the request. 161s if read_timeout == 0: 161s raise ReadTimeoutError( 161s self, url, f"Read timed out. (read timeout={read_timeout})" 161s ) 161s conn.timeout = read_timeout 161s 161s # Receive the response from the server 161s try: 161s response = conn.getresponse() 161s except (BaseSSLError, OSError) as e: 161s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 161s raise 161s 161s # Set properties that are used by the pooling layer. 
161s response.retries = retries 161s response._connection = response_conn # type: ignore[attr-defined] 161s response._pool = self # type: ignore[attr-defined] 161s 161s log.debug( 161s '%s://%s:%s "%s %s %s" %s %s', 161s self.scheme, 161s self.host, 161s self.port, 161s method, 161s url, 161s > response.version_string, 161s response.status, 161s response.length_remaining, 161s ) 161s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 161s 161s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 161s ----------------------------- Captured stderr call ----------------------------- 161s 127.0.0.1 - - [18/Jan/2025 02:32:12] "GET / HTTP/1.1" 200 9358 161s 127.0.0.1 - - [18/Jan/2025 02:32:12] "GET http://127.0.0.1:44083/ HTTP/1.1" 200 - 161s ______________________________ test_cross_scheme _______________________________ 161s 161s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-1/test_cross_scheme2') 161s httpbin = 161s httpbin_secure = 161s verify_pool_mgr = 161s 161s def test_cross_scheme(tmpdir, httpbin, httpbin_secure, verify_pool_mgr): 161s """Ensure that requests between schemes are treated separately""" 161s # First fetch a url under http, and then again under https and then 161s # ensure that we haven't served anything out of cache, and we have two 161s # requests / response pairs in the cassette 161s with vcr.use_cassette(str(tmpdir.join("cross_scheme.yaml"))) as cass: 161s > verify_pool_mgr.request("GET", httpbin_secure.url) 161s 161s tests/integration/test_urllib3.py:125: 161s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 161s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 161s return self.request_encode_url( 161s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 161s return self.urlopen(method, url, **extra_kw) 161s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 161s response = conn.urlopen(method, u.request_uri, **kw) 161s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 161s response = self._make_request( 161s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 161s 161s self = 161s conn = 161s method = 'GET', url = '/', body = None, headers = {} 161s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 161s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 161s chunked = False, response_conn = None, preload_content = True 161s decode_content = True, enforce_content_length = True 161s 161s def _make_request( 161s self, 161s conn: BaseHTTPConnection, 161s method: str, 161s url: str, 161s body: _TYPE_BODY | None = None, 161s headers: typing.Mapping[str, str] | None = None, 161s retries: Retry | None = None, 161s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 161s chunked: bool = False, 161s response_conn: BaseHTTPConnection | None = None, 161s preload_content: bool = True, 161s decode_content: bool = True, 161s enforce_content_length: bool = True, 161s ) -> BaseHTTPResponse: 161s """ 161s Perform a request on a given urllib connection object taken from our 161s pool. 161s 161s :param conn: 161s a connection from one of our connection pools 161s 161s :param method: 161s HTTP request method (such as GET, POST, PUT, etc.) 161s 161s :param url: 161s The URL to perform the request on. 
161s 161s :param body: 161s Data to send in the request body, either :class:`str`, :class:`bytes`, 161s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 161s 161s :param headers: 161s Dictionary of custom headers to send, such as User-Agent, 161s If-None-Match, etc. If None, pool headers are used. If provided, 161s these headers completely replace any pool-specific headers. 161s 161s :param retries: 161s Configure the number of retries to allow before raising a 161s :class:`~urllib3.exceptions.MaxRetryError` exception. 161s 161s Pass ``None`` to retry until you receive a response. Pass a 161s :class:`~urllib3.util.retry.Retry` object for fine-grained control 161s over different types of retries. 161s Pass an integer number to retry connection errors that many times, 161s but no other types of errors. Pass zero to never retry. 161s 161s If ``False``, then retries are disabled and any exception is raised 161s immediately. Also, instead of raising a MaxRetryError on redirects, 161s the redirect response will be returned. 161s 161s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 161s 161s :param timeout: 161s If specified, overrides the default timeout for this one 161s request. It may be a float (in seconds) or an instance of 161s :class:`urllib3.util.Timeout`. 161s 161s :param chunked: 161s If True, urllib3 will send the body using chunked transfer 161s encoding. Otherwise, urllib3 will send the body using the standard 161s content-length form. Defaults to False. 161s 161s :param response_conn: 161s Set this to ``None`` if you will handle releasing the connection or 161s set the connection to have the response release it. 161s 161s :param preload_content: 161s If True, the response's body will be preloaded during construction. 161s 161s :param decode_content: 161s If True, will attempt to decode the body based on the 161s 'content-encoding' header. 161s 161s :param enforce_content_length: 161s Enforce content length checking. Body returned by server must match 161s value of Content-Length header, if present. Otherwise, raise error. 161s """ 161s self.num_requests += 1 161s 161s timeout_obj = self._get_timeout(timeout) 161s timeout_obj.start_connect() 161s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 161s 161s try: 161s # Trigger any extra validation we need to do. 161s try: 161s self._validate_conn(conn) 161s except (SocketTimeout, BaseSSLError) as e: 161s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 161s raise 161s 161s # _validate_conn() starts the connection to an HTTPS proxy 161s # so we need to wrap errors with 'ProxyError' here too. 161s except ( 161s OSError, 161s NewConnectionError, 161s TimeoutError, 161s BaseSSLError, 161s CertificateError, 161s SSLError, 161s ) as e: 161s new_e: Exception = e 161s if isinstance(e, (BaseSSLError, CertificateError)): 161s new_e = SSLError(e) 161s # If the connection didn't successfully connect to it's proxy 161s # then there 161s if isinstance( 161s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 161s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 161s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 161s raise new_e 161s 161s # conn.request() calls http.client.*.request, not the method in 161s # urllib3.request. It also calls makefile (recv) on the socket. 
161s try: 161s conn.request( 161s method, 161s url, 161s body=body, 161s headers=headers, 161s chunked=chunked, 161s preload_content=preload_content, 161s decode_content=decode_content, 161s enforce_content_length=enforce_content_length, 161s ) 161s 161s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 161s # legitimately able to close the connection after sending a valid response. 161s # With this behaviour, the received response is still readable. 161s except BrokenPipeError: 161s pass 161s except OSError as e: 161s # MacOS/Linux 161s # EPROTOTYPE and ECONNRESET are needed on macOS 161s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 161s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 161s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 161s raise 161s 161s # Reset the timeout for the recv() on the socket 161s read_timeout = timeout_obj.read_timeout 161s 161s if not conn.is_closed: 161s # In Python 3 socket.py will catch EAGAIN and return None when you 161s # try and read into the file pointer created by http.client, which 161s # instead raises a BadStatusLine exception. Instead of catching 161s # the exception and assuming all BadStatusLine exceptions are read 161s # timeouts, check for a zero timeout before making the request. 161s if read_timeout == 0: 161s raise ReadTimeoutError( 161s self, url, f"Read timed out. (read timeout={read_timeout})" 161s ) 161s conn.timeout = read_timeout 161s 161s # Receive the response from the server 161s try: 161s response = conn.getresponse() 161s except (BaseSSLError, OSError) as e: 161s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 161s raise 161s 161s # Set properties that are used by the pooling layer. 
161s response.retries = retries 161s response._connection = response_conn # type: ignore[attr-defined] 161s response._pool = self # type: ignore[attr-defined] 161s 161s log.debug( 161s '%s://%s:%s "%s %s %s" %s %s', 161s self.scheme, 161s self.host, 161s self.port, 161s method, 161s url, 161s > response.version_string, 161s response.status, 161s response.length_remaining, 161s ) 161s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 161s 161s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 161s ----------------------------- Captured stderr call ----------------------------- 161s 127.0.0.1 - - [18/Jan/2025 02:32:12] "GET / HTTP/1.1" 200 9358 161s ___________________ test_https_with_cert_validation_disabled ___________________ 161s 161s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-1/test_https_with_cert_validatio0') 161s httpbin_secure = 161s pool_mgr = 161s 161s def test_https_with_cert_validation_disabled(tmpdir, httpbin_secure, pool_mgr): 161s with vcr.use_cassette(str(tmpdir.join("cert_validation_disabled.yaml"))): 161s > pool_mgr.request("GET", httpbin_secure.url) 161s 161s tests/integration/test_urllib3.py:149: 161s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 161s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 161s return self.request_encode_url( 161s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 161s return self.urlopen(method, url, **extra_kw) 161s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 161s response = conn.urlopen(method, u.request_uri, **kw) 161s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 161s response = self._make_request( 161s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 161s 161s self = 161s conn = 161s method = 'GET', url = '/', body = None, headers = {} 161s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 161s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 161s chunked = False, response_conn = None, preload_content = True 161s decode_content = True, enforce_content_length = True 161s 161s def _make_request( 161s self, 161s conn: BaseHTTPConnection, 161s method: str, 161s url: str, 161s body: _TYPE_BODY | None = None, 161s headers: typing.Mapping[str, str] | None = None, 161s retries: Retry | None = None, 161s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 161s chunked: bool = False, 161s response_conn: BaseHTTPConnection | None = None, 161s preload_content: bool = True, 161s decode_content: bool = True, 161s enforce_content_length: bool = True, 161s ) -> BaseHTTPResponse: 161s """ 161s Perform a request on a given urllib connection object taken from our 161s pool. 161s 161s :param conn: 161s a connection from one of our connection pools 161s 161s :param method: 161s HTTP request method (such as GET, POST, PUT, etc.) 161s 161s :param url: 161s The URL to perform the request on. 161s 161s :param body: 161s Data to send in the request body, either :class:`str`, :class:`bytes`, 161s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 161s 161s :param headers: 161s Dictionary of custom headers to send, such as User-Agent, 161s If-None-Match, etc. If None, pool headers are used. If provided, 161s these headers completely replace any pool-specific headers. 
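The headers behaviour documented just above is worth illustrating: when per-request headers are supplied, they replace the pool's default headers outright rather than merging with them. A minimal sketch, assuming a local server on 127.0.0.1:8080 (host, port and header names are placeholders, not taken from this log):

import urllib3

# Pool-level default headers.
pool = urllib3.HTTPConnectionPool("127.0.0.1", 8080, headers={"User-Agent": "pool-agent"})

# Passing headers here replaces the pool's headers entirely: the request
# carries X-Debug but not the pool's User-Agent.
resp = pool.urlopen("GET", "/", headers={"X-Debug": "1"})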
161s 161s :param retries: 161s Configure the number of retries to allow before raising a 161s :class:`~urllib3.exceptions.MaxRetryError` exception. 161s 161s Pass ``None`` to retry until you receive a response. Pass a 161s :class:`~urllib3.util.retry.Retry` object for fine-grained control 161s over different types of retries. 161s Pass an integer number to retry connection errors that many times, 161s but no other types of errors. Pass zero to never retry. 161s 161s If ``False``, then retries are disabled and any exception is raised 161s immediately. Also, instead of raising a MaxRetryError on redirects, 161s the redirect response will be returned. 161s 161s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 161s 161s :param timeout: 161s If specified, overrides the default timeout for this one 161s request. It may be a float (in seconds) or an instance of 161s :class:`urllib3.util.Timeout`. 161s 161s :param chunked: 161s If True, urllib3 will send the body using chunked transfer 161s encoding. Otherwise, urllib3 will send the body using the standard 161s content-length form. Defaults to False. 161s 161s :param response_conn: 161s Set this to ``None`` if you will handle releasing the connection or 161s set the connection to have the response release it. 161s 161s :param preload_content: 161s If True, the response's body will be preloaded during construction. 161s 161s :param decode_content: 161s If True, will attempt to decode the body based on the 161s 'content-encoding' header. 161s 161s :param enforce_content_length: 161s Enforce content length checking. Body returned by server must match 161s value of Content-Length header, if present. Otherwise, raise error. 161s """ 161s self.num_requests += 1 161s 161s timeout_obj = self._get_timeout(timeout) 161s timeout_obj.start_connect() 161s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 161s 161s try: 161s # Trigger any extra validation we need to do. 161s try: 161s self._validate_conn(conn) 161s except (SocketTimeout, BaseSSLError) as e: 161s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 161s raise 161s 161s # _validate_conn() starts the connection to an HTTPS proxy 161s # so we need to wrap errors with 'ProxyError' here too. 161s except ( 161s OSError, 161s NewConnectionError, 161s TimeoutError, 161s BaseSSLError, 161s CertificateError, 161s SSLError, 161s ) as e: 161s new_e: Exception = e 161s if isinstance(e, (BaseSSLError, CertificateError)): 161s new_e = SSLError(e) 161s # If the connection didn't successfully connect to it's proxy 161s # then there 161s if isinstance( 161s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 161s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 161s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 161s raise new_e 161s 161s # conn.request() calls http.client.*.request, not the method in 161s # urllib3.request. It also calls makefile (recv) on the socket. 161s try: 161s conn.request( 161s method, 161s url, 161s body=body, 161s headers=headers, 161s chunked=chunked, 161s preload_content=preload_content, 161s decode_content=decode_content, 161s enforce_content_length=enforce_content_length, 161s ) 161s 161s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 161s # legitimately able to close the connection after sending a valid response. 161s # With this behaviour, the received response is still readable. 
161s except BrokenPipeError: 161s pass 161s except OSError as e: 161s # MacOS/Linux 161s # EPROTOTYPE and ECONNRESET are needed on macOS 161s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 161s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 161s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 161s raise 161s 161s # Reset the timeout for the recv() on the socket 161s read_timeout = timeout_obj.read_timeout 161s 161s if not conn.is_closed: 161s # In Python 3 socket.py will catch EAGAIN and return None when you 161s # try and read into the file pointer created by http.client, which 161s # instead raises a BadStatusLine exception. Instead of catching 161s # the exception and assuming all BadStatusLine exceptions are read 161s # timeouts, check for a zero timeout before making the request. 161s if read_timeout == 0: 161s raise ReadTimeoutError( 161s self, url, f"Read timed out. (read timeout={read_timeout})" 161s ) 161s conn.timeout = read_timeout 161s 161s # Receive the response from the server 161s try: 161s response = conn.getresponse() 161s except (BaseSSLError, OSError) as e: 161s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 161s raise 161s 161s # Set properties that are used by the pooling layer. 161s response.retries = retries 161s response._connection = response_conn # type: ignore[attr-defined] 161s response._pool = self # type: ignore[attr-defined] 161s 161s log.debug( 161s '%s://%s:%s "%s %s %s" %s %s', 161s self.scheme, 161s self.host, 161s self.port, 161s method, 161s url, 161s > response.version_string, 161s response.status, 161s response.length_remaining, 161s ) 161s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 161s 161s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 161s ----------------------------- Captured stderr call ----------------------------- 161s 127.0.0.1 - - [18/Jan/2025 02:32:12] "GET / HTTP/1.1" 200 9358 161s _____________________________ test_domain_redirect _____________________________ 161s 161s def test_domain_redirect(): 161s """Ensure that redirects across domains are considered unique""" 161s # In this example, seomoz.org redirects to moz.com, and if those 161s # requests are considered identical, then we'll be stuck in a redirect 161s # loop. 
161s url = "http://seomoz.org/" 161s with vcr.use_cassette("tests/fixtures/wild/domain_redirect.yaml") as cass: 161s > requests.get(url, headers={"User-Agent": "vcrpy-test"}) 161s 161s tests/integration/test_wild.py:20: 161s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 161s /usr/lib/python3/dist-packages/requests/api.py:73: in get 161s return request("get", url, params=params, **kwargs) 161s /usr/lib/python3/dist-packages/requests/api.py:59: in request 161s return session.request(method=method, url=url, **kwargs) 161s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 161s resp = self.send(prep, **send_kwargs) 161s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 161s r = adapter.send(request, **kwargs) 161s /usr/lib/python3/dist-packages/requests/adapters.py:667: in send 161s resp = conn.urlopen( 161s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 161s response = self._make_request( 161s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 161s 161s self = 161s conn = 161s method = 'GET', url = '/', body = None 161s headers = {'User-Agent': 'vcrpy-test', 'Accept-Encoding': 'gzip, deflate, br', 'Accept': '*/*', 'Connection': 'keep-alive'} 161s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 161s timeout = Timeout(connect=None, read=None, total=None), chunked = False 161s response_conn = 161s preload_content = False, decode_content = False, enforce_content_length = True 161s 161s def _make_request( 161s self, 161s conn: BaseHTTPConnection, 161s method: str, 161s url: str, 161s body: _TYPE_BODY | None = None, 161s headers: typing.Mapping[str, str] | None = None, 161s retries: Retry | None = None, 161s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 161s chunked: bool = False, 161s response_conn: BaseHTTPConnection | None = None, 161s preload_content: bool = True, 161s decode_content: bool = True, 161s enforce_content_length: bool = True, 161s ) -> BaseHTTPResponse: 161s """ 161s Perform a request on a given urllib connection object taken from our 161s pool. 161s 161s :param conn: 161s a connection from one of our connection pools 161s 161s :param method: 161s HTTP request method (such as GET, POST, PUT, etc.) 161s 161s :param url: 161s The URL to perform the request on. 161s 161s :param body: 161s Data to send in the request body, either :class:`str`, :class:`bytes`, 161s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 161s 161s :param headers: 161s Dictionary of custom headers to send, such as User-Agent, 161s If-None-Match, etc. If None, pool headers are used. If provided, 161s these headers completely replace any pool-specific headers. 161s 161s :param retries: 161s Configure the number of retries to allow before raising a 161s :class:`~urllib3.exceptions.MaxRetryError` exception. 161s 161s Pass ``None`` to retry until you receive a response. Pass a 161s :class:`~urllib3.util.retry.Retry` object for fine-grained control 161s over different types of retries. 161s Pass an integer number to retry connection errors that many times, 161s but no other types of errors. Pass zero to never retry. 161s 161s If ``False``, then retries are disabled and any exception is raised 161s immediately. Also, instead of raising a MaxRetryError on redirects, 161s the redirect response will be returned. 161s 161s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 
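To make the retries options documented above concrete, a short usage sketch follows (URLs and retry counts are illustrative, not taken from this log): an integer retries connection errors only, False disables retries and hands back redirect responses as-is, and a Retry object gives fine-grained control.

import urllib3
from urllib3.util.retry import Retry

http = urllib3.PoolManager()

# Fine-grained control: up to 3 total attempts, also retrying 502/503 responses.
retry = Retry(total=3, connect=2, read=2, status_forcelist=[502, 503])
resp = http.request("GET", "http://127.0.0.1:8080/", retries=retry)

# Integer form: retry connection errors twice, nothing else.
resp = http.request("GET", "http://127.0.0.1:8080/", retries=2)

# False: never retry; per the docstring, a redirect comes back as the
# response instead of raising MaxRetryError.
resp = http.request("GET", "http://127.0.0.1:8080/redirect", retries=False)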
161s 161s :param timeout: 161s If specified, overrides the default timeout for this one 161s request. It may be a float (in seconds) or an instance of 161s :class:`urllib3.util.Timeout`. 161s 161s :param chunked: 161s If True, urllib3 will send the body using chunked transfer 161s encoding. Otherwise, urllib3 will send the body using the standard 161s content-length form. Defaults to False. 161s 161s :param response_conn: 161s Set this to ``None`` if you will handle releasing the connection or 161s set the connection to have the response release it. 161s 161s :param preload_content: 161s If True, the response's body will be preloaded during construction. 161s 161s :param decode_content: 161s If True, will attempt to decode the body based on the 161s 'content-encoding' header. 161s 161s :param enforce_content_length: 161s Enforce content length checking. Body returned by server must match 161s value of Content-Length header, if present. Otherwise, raise error. 161s """ 161s self.num_requests += 1 161s 161s timeout_obj = self._get_timeout(timeout) 161s timeout_obj.start_connect() 161s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 161s 161s try: 161s # Trigger any extra validation we need to do. 161s try: 161s self._validate_conn(conn) 161s except (SocketTimeout, BaseSSLError) as e: 161s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 161s raise 161s 161s # _validate_conn() starts the connection to an HTTPS proxy 161s # so we need to wrap errors with 'ProxyError' here too. 161s except ( 161s OSError, 161s NewConnectionError, 161s TimeoutError, 161s BaseSSLError, 161s CertificateError, 161s SSLError, 161s ) as e: 161s new_e: Exception = e 161s if isinstance(e, (BaseSSLError, CertificateError)): 161s new_e = SSLError(e) 161s # If the connection didn't successfully connect to it's proxy 161s # then there 161s if isinstance( 161s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 161s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 161s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 161s raise new_e 161s 161s # conn.request() calls http.client.*.request, not the method in 161s # urllib3.request. It also calls makefile (recv) on the socket. 161s try: 161s conn.request( 161s method, 161s url, 161s body=body, 161s headers=headers, 161s chunked=chunked, 161s preload_content=preload_content, 161s decode_content=decode_content, 161s enforce_content_length=enforce_content_length, 161s ) 161s 161s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 161s # legitimately able to close the connection after sending a valid response. 161s # With this behaviour, the received response is still readable. 161s except BrokenPipeError: 161s pass 161s except OSError as e: 161s # MacOS/Linux 161s # EPROTOTYPE and ECONNRESET are needed on macOS 161s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 161s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 161s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 161s raise 161s 161s # Reset the timeout for the recv() on the socket 161s read_timeout = timeout_obj.read_timeout 161s 161s if not conn.is_closed: 161s # In Python 3 socket.py will catch EAGAIN and return None when you 161s # try and read into the file pointer created by http.client, which 161s # instead raises a BadStatusLine exception. 
Instead of catching 161s # the exception and assuming all BadStatusLine exceptions are read 161s # timeouts, check for a zero timeout before making the request. 161s if read_timeout == 0: 161s raise ReadTimeoutError( 161s self, url, f"Read timed out. (read timeout={read_timeout})" 161s ) 161s conn.timeout = read_timeout 161s 161s # Receive the response from the server 161s try: 161s response = conn.getresponse() 161s except (BaseSSLError, OSError) as e: 161s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 161s raise 161s 161s # Set properties that are used by the pooling layer. 161s response.retries = retries 161s response._connection = response_conn # type: ignore[attr-defined] 161s response._pool = self # type: ignore[attr-defined] 161s 161s log.debug( 161s '%s://%s:%s "%s %s %s" %s %s', 161s self.scheme, 161s self.host, 161s self.port, 161s method, 161s url, 161s > response.version_string, 161s response.status, 161s response.length_remaining, 161s ) 161s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 161s 161s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 161s _________________________________ test_cookies _________________________________ 161s 161s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-1/test_cookies0') 161s httpbin = 161s 161s def test_cookies(tmpdir, httpbin): 161s testfile = str(tmpdir.join("cookies.yml")) 161s with vcr.use_cassette(testfile): 161s with requests.Session() as s: 161s > s.get(httpbin.url + "/cookies/set?k1=v1&k2=v2") 161s 161s tests/integration/test_wild.py:67: 161s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 161s /usr/lib/python3/dist-packages/requests/sessions.py:602: in get 161s return self.request("GET", url, **kwargs) 161s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 161s resp = self.send(prep, **send_kwargs) 161s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 161s r = adapter.send(request, **kwargs) 161s /usr/lib/python3/dist-packages/requests/adapters.py:667: in send 161s resp = conn.urlopen( 161s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 161s response = self._make_request( 161s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 161s 161s self = 161s conn = 161s method = 'GET', url = '/cookies/set?k1=v1&k2=v2', body = None 161s headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate, br', 'Accept': '*/*', 'Connection': 'keep-alive'} 161s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 161s timeout = Timeout(connect=None, read=None, total=None), chunked = False 161s response_conn = 161s preload_content = False, decode_content = False, enforce_content_length = True 161s 161s def _make_request( 161s self, 161s conn: BaseHTTPConnection, 161s method: str, 161s url: str, 161s body: _TYPE_BODY | None = None, 161s headers: typing.Mapping[str, str] | None = None, 161s retries: Retry | None = None, 161s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 161s chunked: bool = False, 161s response_conn: BaseHTTPConnection | None = None, 161s preload_content: bool = True, 161s decode_content: bool = True, 161s enforce_content_length: bool = True, 161s ) -> BaseHTTPResponse: 161s """ 161s Perform a request on a given urllib connection object taken from our 161s pool. 
161s 161s :param conn: 161s a connection from one of our connection pools 161s 161s :param method: 161s HTTP request method (such as GET, POST, PUT, etc.) 161s 161s :param url: 161s The URL to perform the request on. 161s 161s :param body: 161s Data to send in the request body, either :class:`str`, :class:`bytes`, 161s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 161s 161s :param headers: 161s Dictionary of custom headers to send, such as User-Agent, 161s If-None-Match, etc. If None, pool headers are used. If provided, 161s these headers completely replace any pool-specific headers. 161s 161s :param retries: 161s Configure the number of retries to allow before raising a 161s :class:`~urllib3.exceptions.MaxRetryError` exception. 161s 161s Pass ``None`` to retry until you receive a response. Pass a 161s :class:`~urllib3.util.retry.Retry` object for fine-grained control 161s over different types of retries. 161s Pass an integer number to retry connection errors that many times, 161s but no other types of errors. Pass zero to never retry. 161s 161s If ``False``, then retries are disabled and any exception is raised 161s immediately. Also, instead of raising a MaxRetryError on redirects, 161s the redirect response will be returned. 161s 161s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 161s 161s :param timeout: 161s If specified, overrides the default timeout for this one 161s request. It may be a float (in seconds) or an instance of 161s :class:`urllib3.util.Timeout`. 161s 161s :param chunked: 161s If True, urllib3 will send the body using chunked transfer 161s encoding. Otherwise, urllib3 will send the body using the standard 161s content-length form. Defaults to False. 161s 161s :param response_conn: 161s Set this to ``None`` if you will handle releasing the connection or 161s set the connection to have the response release it. 161s 161s :param preload_content: 161s If True, the response's body will be preloaded during construction. 161s 161s :param decode_content: 161s If True, will attempt to decode the body based on the 161s 'content-encoding' header. 161s 161s :param enforce_content_length: 161s Enforce content length checking. Body returned by server must match 161s value of Content-Length header, if present. Otherwise, raise error. 161s """ 161s self.num_requests += 1 161s 161s timeout_obj = self._get_timeout(timeout) 161s timeout_obj.start_connect() 161s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 161s 161s try: 161s # Trigger any extra validation we need to do. 161s try: 161s self._validate_conn(conn) 161s except (SocketTimeout, BaseSSLError) as e: 161s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 161s raise 161s 161s # _validate_conn() starts the connection to an HTTPS proxy 161s # so we need to wrap errors with 'ProxyError' here too. 161s except ( 161s OSError, 161s NewConnectionError, 161s TimeoutError, 161s BaseSSLError, 161s CertificateError, 161s SSLError, 161s ) as e: 161s new_e: Exception = e 161s if isinstance(e, (BaseSSLError, CertificateError)): 161s new_e = SSLError(e) 161s # If the connection didn't successfully connect to it's proxy 161s # then there 161s if isinstance( 161s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 161s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 161s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 161s raise new_e 161s 161s # conn.request() calls http.client.*.request, not the method in 161s # urllib3.request. 
It also calls makefile (recv) on the socket. 161s try: 161s conn.request( 161s method, 161s url, 161s body=body, 161s headers=headers, 161s chunked=chunked, 161s preload_content=preload_content, 161s decode_content=decode_content, 161s enforce_content_length=enforce_content_length, 161s ) 161s 161s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 161s # legitimately able to close the connection after sending a valid response. 161s # With this behaviour, the received response is still readable. 161s except BrokenPipeError: 161s pass 161s except OSError as e: 161s # MacOS/Linux 161s # EPROTOTYPE and ECONNRESET are needed on macOS 161s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 161s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 161s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 161s raise 161s 161s # Reset the timeout for the recv() on the socket 161s read_timeout = timeout_obj.read_timeout 161s 161s if not conn.is_closed: 161s # In Python 3 socket.py will catch EAGAIN and return None when you 161s # try and read into the file pointer created by http.client, which 161s # instead raises a BadStatusLine exception. Instead of catching 161s # the exception and assuming all BadStatusLine exceptions are read 161s # timeouts, check for a zero timeout before making the request. 161s if read_timeout == 0: 161s raise ReadTimeoutError( 161s self, url, f"Read timed out. (read timeout={read_timeout})" 161s ) 161s conn.timeout = read_timeout 161s 161s # Receive the response from the server 161s try: 161s response = conn.getresponse() 161s except (BaseSSLError, OSError) as e: 161s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 161s raise 161s 161s # Set properties that are used by the pooling layer. 161s response.retries = retries 161s response._connection = response_conn # type: ignore[attr-defined] 161s response._pool = self # type: ignore[attr-defined] 161s 161s log.debug( 161s '%s://%s:%s "%s %s %s" %s %s', 161s self.scheme, 161s self.host, 161s self.port, 161s method, 161s url, 161s > response.version_string, 161s response.status, 161s response.length_remaining, 161s ) 161s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 161s 161s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 161s ----------------------------- Captured stderr call ----------------------------- 161s 127.0.0.1 - - [18/Jan/2025 02:32:12] "GET /cookies/set?k1=v1&k2=v2 HTTP/1.1" 302 203 161s =============================== warnings summary =============================== 161s tests/integration/test_config.py:10 161s /tmp/autopkgtest.ZTKQrF/autopkgtest_tmp/build/tests/integration/test_config.py:10: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 161s @pytest.mark.online 161s 161s tests/integration/test_config.py:24 161s /tmp/autopkgtest.ZTKQrF/autopkgtest_tmp/build/tests/integration/test_config.py:24: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? 
You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 161s @pytest.mark.online 161s 161s tests/integration/test_config.py:34 161s /tmp/autopkgtest.ZTKQrF/autopkgtest_tmp/build/tests/integration/test_config.py:34: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 161s @pytest.mark.online 161s 161s tests/integration/test_config.py:47 161s /tmp/autopkgtest.ZTKQrF/autopkgtest_tmp/build/tests/integration/test_config.py:47: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 161s @pytest.mark.online 161s 161s tests/integration/test_config.py:69 161s /tmp/autopkgtest.ZTKQrF/autopkgtest_tmp/build/tests/integration/test_config.py:69: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 161s @pytest.mark.online 161s 161s tests/integration/test_disksaver.py:14 161s /tmp/autopkgtest.ZTKQrF/autopkgtest_tmp/build/tests/integration/test_disksaver.py:14: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 161s @pytest.mark.online 161s 161s tests/integration/test_disksaver.py:35 161s /tmp/autopkgtest.ZTKQrF/autopkgtest_tmp/build/tests/integration/test_disksaver.py:35: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 161s @pytest.mark.online 161s 161s tests/integration/test_httplib2.py:60 161s /tmp/autopkgtest.ZTKQrF/autopkgtest_tmp/build/tests/integration/test_httplib2.py:60: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 161s @pytest.mark.online 161s 161s tests/integration/test_register_matcher.py:16 161s /tmp/autopkgtest.ZTKQrF/autopkgtest_tmp/build/tests/integration/test_register_matcher.py:16: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 161s @pytest.mark.online 161s 161s tests/integration/test_register_matcher.py:32 161s /tmp/autopkgtest.ZTKQrF/autopkgtest_tmp/build/tests/integration/test_register_matcher.py:32: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 161s @pytest.mark.online 161s 161s tests/integration/test_urllib2.py:60 161s /tmp/autopkgtest.ZTKQrF/autopkgtest_tmp/build/tests/integration/test_urllib2.py:60: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? 
You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 161s @mark.online 161s 161s tests/integration/test_urllib3.py:102 161s /tmp/autopkgtest.ZTKQrF/autopkgtest_tmp/build/tests/integration/test_urllib3.py:102: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 161s @pytest.mark.online 161s 161s tests/integration/test_wild.py:55 161s /tmp/autopkgtest.ZTKQrF/autopkgtest_tmp/build/tests/integration/test_wild.py:55: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 161s @pytest.mark.online 161s 161s tests/integration/test_wild.py:74 161s /tmp/autopkgtest.ZTKQrF/autopkgtest_tmp/build/tests/integration/test_wild.py:74: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 161s @pytest.mark.online 161s 161s tests/unit/test_stubs.py:20 161s /tmp/autopkgtest.ZTKQrF/autopkgtest_tmp/build/tests/unit/test_stubs.py:20: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 161s @mark.online 161s 161s tests/unit/test_unittest.py:131 161s /tmp/autopkgtest.ZTKQrF/autopkgtest_tmp/build/tests/unit/test_unittest.py:131: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 161s @pytest.mark.online 161s 161s tests/unit/test_unittest.py:166 161s /tmp/autopkgtest.ZTKQrF/autopkgtest_tmp/build/tests/unit/test_unittest.py:166: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 161s @pytest.mark.online 161s 161s tests/integration/test_wild.py::test_xmlrpclib 161s /usr/lib/python3.12/multiprocessing/popen_fork.py:66: DeprecationWarning: This process (pid=2912) is multi-threaded, use of fork() may lead to deadlocks in the child. 161s self.pid = os.fork() 161s 161s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 161s =========================== short test summary info ============================ 161s FAILED tests/integration/test_urllib3.py::test_status_code[http] - AttributeE... 161s FAILED tests/integration/test_urllib3.py::test_headers[http] - AttributeError... 161s FAILED tests/integration/test_urllib3.py::test_body[http] - AttributeError: '... 161s FAILED tests/integration/test_urllib3.py::test_auth[http] - AttributeError: '... 161s FAILED tests/integration/test_urllib3.py::test_auth_failed[http] - AttributeE... 161s FAILED tests/integration/test_urllib3.py::test_post[http] - AttributeError: '... 161s FAILED tests/integration/test_urllib3.py::test_gzip[http] - AttributeError: '... 161s FAILED tests/integration/test_urllib3.py::test_status_code[https] - Attribute... 161s FAILED tests/integration/test_urllib3.py::test_headers[https] - AttributeErro... 161s FAILED tests/integration/test_urllib3.py::test_body[https] - AttributeError: ... 
161s FAILED tests/integration/test_urllib3.py::test_auth[https] - AttributeError: ... 161s FAILED tests/integration/test_urllib3.py::test_auth_failed[https] - Attribute... 161s FAILED tests/integration/test_urllib3.py::test_post[https] - AttributeError: ... 161s FAILED tests/integration/test_urllib3.py::test_gzip[https] - AttributeError: ... 161s FAILED tests/integration/test_proxy.py::test_use_proxy - AttributeError: 'VCR... 161s FAILED tests/integration/test_urllib3.py::test_cross_scheme - AttributeError:... 161s FAILED tests/integration/test_urllib3.py::test_https_with_cert_validation_disabled 161s FAILED tests/integration/test_wild.py::test_domain_redirect - AttributeError:... 161s FAILED tests/integration/test_wild.py::test_cookies - AttributeError: 'VCRHTT... 161s ==== 19 failed, 265 passed, 3 skipped, 19 deselected, 18 warnings in 3.24s ===== 161s E: pybuild pybuild:389: test: plugin distutils failed with: exit code=1: cd /tmp/autopkgtest.ZTKQrF/autopkgtest_tmp/build; python3.12 -m pytest --ignore tests/integration/test_aiohttp.py --ignore tests/integration/test_tornado.py --ignore tests/integration/test_requests.py -m "not online" -k "not test_basic_json_use and not test_load_cassette_with_custom_persister" 161s pybuild-autopkgtest: error: pybuild --autopkgtest --test-pytest -i python{version} -p "3.13 3.12" returned exit code 13 161s make: *** [/tmp/ldfYJwG_L4/run:4: pybuild-autopkgtest] Error 25 161s pybuild-autopkgtest: error: /tmp/ldfYJwG_L4/run pybuild-autopkgtest returned exit code 2 162s autopkgtest [02:32:14]: test pybuild-autopkgtest: -----------------------] 162s pybuild-autopkgtest FAIL non-zero exit status 25 162s autopkgtest [02:32:14]: test pybuild-autopkgtest: - - - - - - - - - - results - - - - - - - - - - 163s autopkgtest [02:32:15]: @@@@@@@@@@@@@@@@@@@@ summary 163s pybuild-autopkgtest FAIL non-zero exit status 25 167s nova [W] Using flock in prodstack6-s390x 167s flock: timeout while waiting to get lock 167s Creating nova instance adt-plucky-s390x-vcr.py-20250118-022932-juju-7f2275-prod-proposed-migration-environment-15-cbba265f-67fa-419e-b994-77d1a78e18a7 from image adt/ubuntu-plucky-s390x-server-20250117.img (UUID 77043c30-ce55-43c6-ae27-2590a56e9de9)... 167s nova [W] Timed out waiting for af5a1f80-ee8e-4c9b-bb97-e19587adf415 to get deleted.
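Every one of the 19 failures above ends in the same AttributeError: urllib3 2.3.0's connectionpool._make_request logs response.version_string (connectionpool.py:551 in the tracebacks), and vcr.py's VCRHTTPResponse stub does not provide that attribute. A minimal sketch of the kind of compatibility shim this implies is shown below; the class name and constructor are hypothetical, and this is not necessarily the fix vcr.py upstream adopted, only an illustration of the missing property.

# Hypothetical stub, for illustration only: mimic http.client's integer
# `version` attribute and derive the `version_string` that urllib3 >= 2.3 logs.
class StubHTTPResponse:
    def __init__(self, version: int = 11) -> None:
        # http.client uses 9, 10 and 11 for HTTP/0.9, HTTP/1.0 and HTTP/1.1.
        self.version = version

    @property
    def version_string(self) -> str:
        return {9: "HTTP/0.9", 10: "HTTP/1.0", 11: "HTTP/1.1"}.get(self.version, "HTTP/?")

Separately, the PytestUnknownMarkWarning entries in the warnings summary can be silenced by registering the custom "online" mark, as the warning text itself suggests. One way to do that, assuming a conftest.py at the test root, is:

# conftest.py (sketch): register the custom mark so pytest stops warning about it.
def pytest_configure(config):
    config.addinivalue_line("markers", "online: tests that require network access")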