0s autopkgtest [02:27:50]: starting date and time: 2025-01-18 02:27:50+0000
0s autopkgtest [02:27:50]: git checkout: 325255d2 Merge branch 'pin-any-arch' into 'ubuntu/production'
0s autopkgtest [02:27:50]: host juju-7f2275-prod-proposed-migration-environment-15; command line: /home/ubuntu/autopkgtest/runner/autopkgtest --output-dir /tmp/autopkgtest-work.p12u13zn/out --timeout-copy=6000 --setup-commands /home/ubuntu/autopkgtest-cloud/worker-config-production/setup-canonical.sh --apt-pocket=proposed=src:python-urllib3 --apt-upgrade vcr.py --timeout-short=300 --timeout-copy=20000 --timeout-build=20000 --env=ADT_TEST_TRIGGERS=python-urllib3/2.3.0-1 -- ssh -s /home/ubuntu/autopkgtest/ssh-setup/nova -- --flavor autopkgtest --security-groups autopkgtest-juju-7f2275-prod-proposed-migration-environment-15@bos03-arm64-33.secgroup --name adt-plucky-arm64-vcr.py-20250118-022750-juju-7f2275-prod-proposed-migration-environment-15-f0f15dcd-3476-4b12-86bb-11581c936de7 --image adt/ubuntu-plucky-arm64-server --keyname testbed-juju-7f2275-prod-proposed-migration-environment-15 --net-id=net_prod-proposed-migration -e TERM=linux -e ''"'"'http_proxy=http://squid.internal:3128'"'"'' -e ''"'"'https_proxy=http://squid.internal:3128'"'"'' -e ''"'"'no_proxy=127.0.0.1,127.0.1.1,login.ubuntu.com,localhost,localdomain,novalocal,internal,archive.ubuntu.com,ports.ubuntu.com,security.ubuntu.com,ddebs.ubuntu.com,changelogs.ubuntu.com,keyserver.ubuntu.com,launchpadlibrarian.net,launchpadcontent.net,launchpad.net,10.24.0.0/24,keystone.ps5.canonical.com,objectstorage.prodstack5.canonical.com,radosgw.ps5.canonical.com'"'"'' --mirror=http://ftpmaster.internal/ubuntu/
129s autopkgtest [02:29:59]: testbed dpkg architecture: arm64
130s autopkgtest [02:30:00]: testbed apt version: 2.9.18
130s autopkgtest [02:30:00]: @@@@@@@@@@@@@@@@@@@@ test bed setup
130s autopkgtest [02:30:00]: testbed release detected to be: None
131s autopkgtest [02:30:01]: updating testbed package index (apt update)
131s Get:1 http://ftpmaster.internal/ubuntu plucky-proposed InRelease [73.9 kB]
132s Hit:2 http://ftpmaster.internal/ubuntu plucky InRelease
132s Hit:3 http://ftpmaster.internal/ubuntu plucky-updates InRelease
132s Hit:4 http://ftpmaster.internal/ubuntu plucky-security InRelease
132s Get:5 http://ftpmaster.internal/ubuntu plucky-proposed/main Sources [156 kB]
132s Get:6 http://ftpmaster.internal/ubuntu plucky-proposed/universe Sources [838 kB]
132s Get:7 http://ftpmaster.internal/ubuntu plucky-proposed/multiverse Sources [15.3 kB]
132s Get:8 http://ftpmaster.internal/ubuntu plucky-proposed/restricted Sources [9708 B]
132s Get:9 http://ftpmaster.internal/ubuntu plucky-proposed/main arm64 Packages [284 kB]
132s Get:10 http://ftpmaster.internal/ubuntu plucky-proposed/restricted arm64 Packages [57.8 kB]
132s Get:11 http://ftpmaster.internal/ubuntu plucky-proposed/universe arm64 Packages [986 kB]
132s Get:12 http://ftpmaster.internal/ubuntu plucky-proposed/multiverse arm64 Packages [10.8 kB]
132s Fetched 2431 kB in 1s (2417 kB/s)
133s Reading package lists...
134s Reading package lists...
134s Building dependency tree...
134s Reading state information...
135s Calculating upgrade...
135s The following packages will be upgraded:
135s   gcc-14-base libatomic1 libgcc-s1 libgudev-1.0-0 libstdc++6 python3-certifi
135s   python3-chardet python3-jwt rng-tools-debian usb.ids
135s 10 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
135s Need to get 1303 kB of archives.
135s After this operation, 0 B of additional disk space will be used.
135s Get:1 http://ftpmaster.internal/ubuntu plucky/universe arm64 rng-tools-debian arm64 2.6 [44.3 kB]
135s Get:2 http://ftpmaster.internal/ubuntu plucky/main arm64 libatomic1 arm64 14.2.0-13ubuntu1 [11.5 kB]
135s Get:3 http://ftpmaster.internal/ubuntu plucky/main arm64 gcc-14-base arm64 14.2.0-13ubuntu1 [53.0 kB]
135s Get:4 http://ftpmaster.internal/ubuntu plucky/main arm64 libstdc++6 arm64 14.2.0-13ubuntu1 [748 kB]
136s Get:5 http://ftpmaster.internal/ubuntu plucky/main arm64 libgcc-s1 arm64 14.2.0-13ubuntu1 [61.8 kB]
136s Get:6 http://ftpmaster.internal/ubuntu plucky/main arm64 usb.ids all 2025.01.14-1 [223 kB]
136s Get:7 http://ftpmaster.internal/ubuntu plucky/main arm64 libgudev-1.0-0 arm64 1:238-6 [14.9 kB]
136s Get:8 http://ftpmaster.internal/ubuntu plucky/main arm64 python3-certifi all 2024.12.14+ds-1 [9800 B]
136s Get:9 http://ftpmaster.internal/ubuntu plucky/main arm64 python3-chardet all 5.2.0+dfsg-2 [116 kB]
136s Get:10 http://ftpmaster.internal/ubuntu plucky/main arm64 python3-jwt all 2.10.1-2 [21.0 kB]
136s Fetched 1303 kB in 1s (2064 kB/s)
136s (Reading database ... 80204 files and directories currently installed.)
136s Preparing to unpack .../rng-tools-debian_2.6_arm64.deb ...
136s Unpacking rng-tools-debian (2.6) over (2.5) ...
137s Preparing to unpack .../libatomic1_14.2.0-13ubuntu1_arm64.deb ...
137s Unpacking libatomic1:arm64 (14.2.0-13ubuntu1) over (14.2.0-12ubuntu1) ...
137s Preparing to unpack .../gcc-14-base_14.2.0-13ubuntu1_arm64.deb ...
137s Unpacking gcc-14-base:arm64 (14.2.0-13ubuntu1) over (14.2.0-12ubuntu1) ...
137s Setting up gcc-14-base:arm64 (14.2.0-13ubuntu1) ...
137s (Reading database ... 80204 files and directories currently installed.)
137s Preparing to unpack .../libstdc++6_14.2.0-13ubuntu1_arm64.deb ...
137s Unpacking libstdc++6:arm64 (14.2.0-13ubuntu1) over (14.2.0-12ubuntu1) ...
137s Setting up libstdc++6:arm64 (14.2.0-13ubuntu1) ...
137s (Reading database ... 80204 files and directories currently installed.)
137s Preparing to unpack .../libgcc-s1_14.2.0-13ubuntu1_arm64.deb ...
137s Unpacking libgcc-s1:arm64 (14.2.0-13ubuntu1) over (14.2.0-12ubuntu1) ...
137s Setting up libgcc-s1:arm64 (14.2.0-13ubuntu1) ...
137s (Reading database ... 80204 files and directories currently installed.)
137s Preparing to unpack .../usb.ids_2025.01.14-1_all.deb ...
137s Unpacking usb.ids (2025.01.14-1) over (2024.12.04-1) ...
137s Preparing to unpack .../libgudev-1.0-0_1%3a238-6_arm64.deb ...
137s Unpacking libgudev-1.0-0:arm64 (1:238-6) over (1:238-5ubuntu1) ...
137s Preparing to unpack .../python3-certifi_2024.12.14+ds-1_all.deb ...
137s Unpacking python3-certifi (2024.12.14+ds-1) over (2024.8.30+dfsg-1) ...
137s Preparing to unpack .../python3-chardet_5.2.0+dfsg-2_all.deb ...
137s Unpacking python3-chardet (5.2.0+dfsg-2) over (5.2.0+dfsg-1) ...
137s Preparing to unpack .../python3-jwt_2.10.1-2_all.deb ...
138s Unpacking python3-jwt (2.10.1-2) over (2.7.0-1) ...
138s Setting up python3-jwt (2.10.1-2) ...
138s Setting up python3-chardet (5.2.0+dfsg-2) ...
138s Setting up python3-certifi (2024.12.14+ds-1) ...
138s Setting up rng-tools-debian (2.6) ...
139s Setting up libatomic1:arm64 (14.2.0-13ubuntu1) ...
139s Setting up usb.ids (2025.01.14-1) ...
139s Setting up libgudev-1.0-0:arm64 (1:238-6) ...
139s Processing triggers for man-db (2.13.0-1) ...
140s Processing triggers for libc-bin (2.40-4ubuntu1) ...
140s Reading package lists...
141s Building dependency tree...
141s Reading state information...
141s 0 upgraded, 0 newly installed, 0 to remove and 1 not upgraded.
141s autopkgtest [02:30:11]: upgrading testbed (apt dist-upgrade and autopurge)
142s Reading package lists...
142s Building dependency tree...
142s Reading state information...
142s Calculating upgrade...
142s Starting pkgProblemResolver with broken count: 0
143s Starting 2 pkgProblemResolver with broken count: 0
143s Done
143s Entering ResolveByKeep
144s 
144s The following packages will be upgraded:
144s   python3-urllib3
144s 1 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
144s Need to get 94.0 kB of archives.
144s After this operation, 18.4 kB of additional disk space will be used.
144s Get:1 http://ftpmaster.internal/ubuntu plucky-proposed/main arm64 python3-urllib3 all 2.3.0-1 [94.0 kB]
145s Fetched 94.0 kB in 0s (337 kB/s)
145s (Reading database ... 80201 files and directories currently installed.)
145s Preparing to unpack .../python3-urllib3_2.3.0-1_all.deb ...
145s Unpacking python3-urllib3 (2.3.0-1) over (2.0.7-2ubuntu0.1) ...
145s Setting up python3-urllib3 (2.3.0-1) ...
145s Reading package lists...
146s Building dependency tree...
146s Reading state information...
146s Starting pkgProblemResolver with broken count: 0
146s Starting 2 pkgProblemResolver with broken count: 0
146s Done
147s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
147s autopkgtest [02:30:17]: rebooting testbed after setup commands that affected boot
171s autopkgtest [02:30:41]: testbed running kernel: Linux 6.11.0-8-generic #8-Ubuntu SMP PREEMPT_DYNAMIC Mon Sep 16 14:19:41 UTC 2024
174s autopkgtest [02:30:44]: @@@@@@@@@@@@@@@@@@@@ apt-source vcr.py
177s Get:1 http://ftpmaster.internal/ubuntu plucky/universe vcr.py 6.0.2-2 (dsc) [2977 B]
177s Get:2 http://ftpmaster.internal/ubuntu plucky/universe vcr.py 6.0.2-2 (tar) [339 kB]
177s Get:3 http://ftpmaster.internal/ubuntu plucky/universe vcr.py 6.0.2-2 (diff) [6348 B]
177s gpgv: Signature made Tue Dec 17 14:55:48 2024 UTC
177s gpgv: using RSA key AC0A4FF12611B6FCCF01C111393587D97D86500B
177s gpgv: Can't check signature: No public key
177s dpkg-source: warning: cannot verify inline signature for ./vcr.py_6.0.2-2.dsc: no acceptable signature found
177s autopkgtest [02:30:47]: testing package vcr.py version 6.0.2-2
177s autopkgtest [02:30:47]: build not needed
178s autopkgtest [02:30:48]: test pybuild-autopkgtest: preparing testbed
178s Reading package lists...
178s Building dependency tree...
178s Reading state information...
179s Starting pkgProblemResolver with broken count: 0
179s Starting 2 pkgProblemResolver with broken count: 0
179s Done
180s The following NEW packages will be installed:
180s   autoconf automake autopoint autotools-dev build-essential cpp cpp-14
180s   cpp-14-aarch64-linux-gnu cpp-aarch64-linux-gnu debhelper debugedit
180s   dh-autoreconf dh-python dh-strip-nondeterminism docutils-common dwz
180s   fonts-font-awesome fonts-lato g++ g++-14 g++-14-aarch64-linux-gnu
180s   g++-aarch64-linux-gnu gcc gcc-14 gcc-14-aarch64-linux-gnu
180s   gcc-aarch64-linux-gnu gettext intltool-debian libarchive-zip-perl libasan8
180s   libcc1-0 libdebhelper-perl libfile-stripnondeterminism-perl libgcc-14-dev
180s   libgomp1 libhwasan0 libisl23 libitm1 libjs-jquery libjs-sphinxdoc
180s   libjs-underscore libjson-perl liblsan0 liblua5.4-0 libmpc3
180s   libpython3.13-minimal libpython3.13-stdlib libstdc++-14-dev libtool libtsan2
180s   libubsan1 m4 pandoc pandoc-data po-debconf pybuild-plugin-autopkgtest
180s   python-vcr-doc python3-aiohappyeyeballs python3-aiohttp python3-aiosignal
180s   python3-alabaster python3-all python3-async-timeout python3-boto3
180s   python3-botocore python3-brotli python3-brotlicffi python3-click
180s   python3-dateutil python3-decorator python3-defusedxml python3-docutils
180s   python3-flasgger python3-flask python3-frozenlist python3-greenlet
180s   python3-httpbin python3-imagesize python3-iniconfig python3-itsdangerous
180s   python3-jmespath python3-mistune python3-multidict python3-packaging
180s   python3-pluggy python3-pytest python3-pytest-httpbin python3-pytest-tornado
180s   python3-roman python3-s3transfer python3-six python3-snowballstemmer
180s   python3-sphinx python3-sphinx-rtd-theme python3-sphinxcontrib.jquery
180s   python3-tornado python3-vcr python3-werkzeug python3-wrapt python3-yarl
180s   python3.13 python3.13-minimal sgml-base sphinx-common
180s   sphinx-rtd-theme-common xml-core
180s 0 upgraded, 106 newly installed, 0 to remove and 0 not upgraded.
180s Need to get 116 MB of archives.
180s After this operation, 606 MB of additional disk space will be used.
180s Get:1 http://ftpmaster.internal/ubuntu plucky/main arm64 fonts-lato all 2.015-1 [2781 kB]
181s Get:2 http://ftpmaster.internal/ubuntu plucky/main arm64 libpython3.13-minimal arm64 3.13.1-2 [879 kB]
182s Get:3 http://ftpmaster.internal/ubuntu plucky/main arm64 python3.13-minimal arm64 3.13.1-2 [2262 kB]
182s Get:4 http://ftpmaster.internal/ubuntu plucky/main arm64 sgml-base all 1.31 [11.4 kB]
182s Get:5 http://ftpmaster.internal/ubuntu plucky/main arm64 m4 arm64 1.4.19-4build1 [240 kB]
182s Get:6 http://ftpmaster.internal/ubuntu plucky/main arm64 autoconf all 2.72-3 [382 kB]
182s Get:7 http://ftpmaster.internal/ubuntu plucky/main arm64 autotools-dev all 20220109.1 [44.9 kB]
182s Get:8 http://ftpmaster.internal/ubuntu plucky/main arm64 automake all 1:1.16.5-1.3ubuntu1 [558 kB]
182s Get:9 http://ftpmaster.internal/ubuntu plucky/main arm64 autopoint all 0.22.5-3 [616 kB]
183s Get:10 http://ftpmaster.internal/ubuntu plucky/main arm64 libisl23 arm64 0.27-1 [676 kB]
183s Get:11 http://ftpmaster.internal/ubuntu plucky/main arm64 libmpc3 arm64 1.3.1-1build2 [56.8 kB]
183s Get:12 http://ftpmaster.internal/ubuntu plucky/main arm64 cpp-14-aarch64-linux-gnu arm64 14.2.0-13ubuntu1 [10.6 MB]
185s Get:13 http://ftpmaster.internal/ubuntu plucky/main arm64 cpp-14 arm64 14.2.0-13ubuntu1 [1030 B]
185s Get:14 http://ftpmaster.internal/ubuntu plucky/main arm64 cpp-aarch64-linux-gnu arm64 4:14.1.0-2ubuntu1 [5452 B]
185s Get:15 http://ftpmaster.internal/ubuntu plucky/main arm64 cpp arm64 4:14.1.0-2ubuntu1 [22.5 kB]
185s Get:16 http://ftpmaster.internal/ubuntu plucky/main arm64 libcc1-0 arm64 14.2.0-13ubuntu1 [49.6 kB]
185s Get:17 http://ftpmaster.internal/ubuntu plucky/main arm64 libgomp1 arm64 14.2.0-13ubuntu1 [145 kB]
185s Get:18 http://ftpmaster.internal/ubuntu plucky/main arm64 libitm1 arm64 14.2.0-13ubuntu1 [27.8 kB]
185s Get:19 http://ftpmaster.internal/ubuntu plucky/main arm64 libasan8 arm64 14.2.0-13ubuntu1 [2893 kB]
186s Get:20 http://ftpmaster.internal/ubuntu plucky/main arm64 liblsan0 arm64 14.2.0-13ubuntu1 [1283 kB]
186s Get:21 http://ftpmaster.internal/ubuntu plucky/main arm64 libtsan2 arm64 14.2.0-13ubuntu1 [2686 kB]
187s Get:22 http://ftpmaster.internal/ubuntu plucky/main arm64 libubsan1 arm64 14.2.0-13ubuntu1 [1152 kB]
187s Get:23 http://ftpmaster.internal/ubuntu plucky/main arm64 libhwasan0 arm64 14.2.0-13ubuntu1 [1598 kB]
187s Get:24 http://ftpmaster.internal/ubuntu plucky/main arm64 libgcc-14-dev arm64 14.2.0-13ubuntu1 [2596 kB]
188s Get:25 http://ftpmaster.internal/ubuntu plucky/main arm64 gcc-14-aarch64-linux-gnu arm64 14.2.0-13ubuntu1 [20.9 MB]
192s Get:26 http://ftpmaster.internal/ubuntu plucky/main arm64 gcc-14 arm64 14.2.0-13ubuntu1 [523 kB]
192s Get:27 http://ftpmaster.internal/ubuntu plucky/main arm64 gcc-aarch64-linux-gnu arm64 4:14.1.0-2ubuntu1 [1200 B]
192s Get:28 http://ftpmaster.internal/ubuntu plucky/main arm64 gcc arm64 4:14.1.0-2ubuntu1 [4994 B]
192s Get:29 http://ftpmaster.internal/ubuntu plucky/main arm64 libstdc++-14-dev arm64 14.2.0-13ubuntu1 [2502 kB]
192s Get:30 http://ftpmaster.internal/ubuntu plucky/main arm64 g++-14-aarch64-linux-gnu arm64 14.2.0-13ubuntu1 [12.1 MB]
194s Get:31 http://ftpmaster.internal/ubuntu plucky/main arm64 g++-14 arm64 14.2.0-13ubuntu1 [21.1 kB]
194s Get:32 http://ftpmaster.internal/ubuntu plucky/main arm64 g++-aarch64-linux-gnu arm64 4:14.1.0-2ubuntu1 [958 B]
194s Get:33 http://ftpmaster.internal/ubuntu plucky/main arm64 g++ arm64 4:14.1.0-2ubuntu1 [1080 B]
194s Get:34 http://ftpmaster.internal/ubuntu plucky/main arm64 build-essential arm64 12.10ubuntu1 [4932 B]
194s Get:35 http://ftpmaster.internal/ubuntu plucky/main arm64 libdebhelper-perl all 13.20ubuntu1 [94.2 kB]
194s Get:36 http://ftpmaster.internal/ubuntu plucky/main arm64 libtool all 2.4.7-8 [166 kB]
194s Get:37 http://ftpmaster.internal/ubuntu plucky/main arm64 dh-autoreconf all 20 [16.1 kB]
194s Get:38 http://ftpmaster.internal/ubuntu plucky/main arm64 libarchive-zip-perl all 1.68-1 [90.2 kB]
194s Get:39 http://ftpmaster.internal/ubuntu plucky/main arm64 libfile-stripnondeterminism-perl all 1.14.0-1 [20.1 kB]
194s Get:40 http://ftpmaster.internal/ubuntu plucky/main arm64 dh-strip-nondeterminism all 1.14.0-1 [5058 B]
194s Get:41 http://ftpmaster.internal/ubuntu plucky/main arm64 debugedit arm64 1:5.1-1 [45.9 kB]
194s Get:42 http://ftpmaster.internal/ubuntu plucky/main arm64 dwz arm64 0.15-1build6 [113 kB]
194s Get:43 http://ftpmaster.internal/ubuntu plucky/main arm64 gettext arm64 0.22.5-3 [932 kB]
194s Get:44 http://ftpmaster.internal/ubuntu plucky/main arm64 intltool-debian all 0.35.0+20060710.6 [23.2 kB]
194s Get:45 http://ftpmaster.internal/ubuntu plucky/main arm64 po-debconf all 1.0.21+nmu1 [233 kB]
194s Get:46 http://ftpmaster.internal/ubuntu plucky/main arm64 debhelper all 13.20ubuntu1 [893 kB]
194s Get:47 http://ftpmaster.internal/ubuntu plucky/universe arm64 dh-python all 6.20241217 [117 kB]
194s Get:48 http://ftpmaster.internal/ubuntu plucky/main arm64 xml-core all 0.19 [20.3 kB]
194s Get:49 http://ftpmaster.internal/ubuntu plucky/main arm64 docutils-common all 0.21.2+dfsg-2 [131 kB]
194s Get:50 http://ftpmaster.internal/ubuntu plucky/main arm64 fonts-font-awesome all 5.0.10+really4.7.0~dfsg-4.1 [516 kB]
195s Get:51 http://ftpmaster.internal/ubuntu plucky/main arm64 libjs-jquery all 3.6.1+dfsg+~3.5.14-1 [328 kB]
195s Get:52 http://ftpmaster.internal/ubuntu plucky/main arm64 libjs-underscore all 1.13.4~dfsg+~1.11.4-3 [118 kB]
195s Get:53 http://ftpmaster.internal/ubuntu plucky/main arm64 libjs-sphinxdoc all 8.1.3-3 [30.9 kB]
195s Get:54 http://ftpmaster.internal/ubuntu plucky/main arm64 libjson-perl all 4.10000-1 [81.9 kB]
195s Get:55 http://ftpmaster.internal/ubuntu plucky/main arm64 liblua5.4-0 arm64 5.4.7-1 [158 kB]
195s Get:56 http://ftpmaster.internal/ubuntu plucky/main arm64 libpython3.13-stdlib arm64 3.13.1-2 [2061 kB]
195s Get:57 http://ftpmaster.internal/ubuntu plucky/universe arm64 pandoc-data all 3.1.11.1-3build1 [78.8 kB]
195s Get:58 http://ftpmaster.internal/ubuntu plucky/universe arm64 pandoc arm64 3.1.11.1+ds-2 [28.1 MB]
198s Get:59 http://ftpmaster.internal/ubuntu plucky/universe arm64 pybuild-plugin-autopkgtest all 6.20241217 [1746 B]
198s Get:60 http://ftpmaster.internal/ubuntu plucky/universe arm64 python-vcr-doc all 6.0.2-2 [184 kB]
198s Get:61 http://ftpmaster.internal/ubuntu plucky/universe arm64 python3-aiohappyeyeballs all 2.4.4-2 [10.6 kB]
198s Get:62 http://ftpmaster.internal/ubuntu plucky/universe arm64 python3-multidict arm64 6.1.0-1build1 [38.3 kB]
198s Get:63 http://ftpmaster.internal/ubuntu plucky/universe arm64 python3-yarl arm64 1.13.1-1build1 [110 kB]
198s Get:64 http://ftpmaster.internal/ubuntu plucky/universe arm64 python3-async-timeout all 5.0.1-1 [6830 B]
198s Get:65 http://ftpmaster.internal/ubuntu plucky/universe arm64 python3-frozenlist arm64 1.5.0-1build1 [59.0 kB]
198s Get:66 http://ftpmaster.internal/ubuntu plucky/universe arm64 python3-aiosignal all 1.3.2-1 [5182 B]
198s Get:67 http://ftpmaster.internal/ubuntu plucky/universe arm64 python3-aiohttp arm64 3.10.11-1 [334 kB]
198s Get:68 http://ftpmaster.internal/ubuntu plucky/main arm64 python3.13 arm64 3.13.1-2 [729 kB]
198s Get:69 http://ftpmaster.internal/ubuntu plucky/main arm64 python3-all arm64 3.12.8-1 [892 B]
198s Get:70 http://ftpmaster.internal/ubuntu plucky/main arm64 python3-dateutil all 2.9.0-3 [80.2 kB]
198s Get:71 http://ftpmaster.internal/ubuntu plucky/main arm64 python3-jmespath all 1.0.1-1 [21.3 kB]
198s Get:72 http://ftpmaster.internal/ubuntu plucky/main arm64 python3-six all 1.17.0-1 [13.2 kB]
198s Get:73 http://ftpmaster.internal/ubuntu plucky/main arm64 python3-botocore all 1.34.46+repack-1ubuntu1 [6211 kB]
198s Get:74 http://ftpmaster.internal/ubuntu plucky/main arm64 python3-s3transfer all 0.10.1-1ubuntu2 [54.3 kB]
198s Get:75 http://ftpmaster.internal/ubuntu plucky/main arm64 python3-boto3 all 1.34.46+dfsg-1ubuntu1 [72.5 kB]
198s Get:76 http://ftpmaster.internal/ubuntu plucky/universe arm64 python3-brotli arm64 1.1.0-2build3 [342 kB]
198s Get:77 http://ftpmaster.internal/ubuntu plucky/universe arm64 python3-brotlicffi arm64 1.1.0.0+ds1-1 [18.6 kB]
198s Get:78 http://ftpmaster.internal/ubuntu plucky/main arm64 python3-click all 8.1.8-1 [79.8 kB]
198s Get:79 http://ftpmaster.internal/ubuntu plucky/main arm64 python3-decorator all 5.1.1-5 [10.1 kB]
198s Get:80 http://ftpmaster.internal/ubuntu plucky/main arm64 python3-defusedxml all 0.7.1-3 [42.2 kB]
198s Get:81 http://ftpmaster.internal/ubuntu plucky/main arm64 python3-roman all 4.2-1 [10.0 kB]
198s Get:82 http://ftpmaster.internal/ubuntu plucky/main arm64 python3-docutils all 0.21.2+dfsg-2 [409 kB]
198s Get:83 http://ftpmaster.internal/ubuntu plucky/main arm64 python3-itsdangerous all 2.2.0-1 [15.2 kB]
198s Get:84 http://ftpmaster.internal/ubuntu plucky/main arm64 python3-werkzeug all 3.1.3-2 [169 kB]
198s Get:85 http://ftpmaster.internal/ubuntu plucky/main arm64 python3-flask all 3.1.0-2ubuntu1 [84.4 kB]
198s Get:86 http://ftpmaster.internal/ubuntu plucky/universe arm64 python3-mistune all 3.0.2-2 [32.9 kB]
198s Get:87 http://ftpmaster.internal/ubuntu plucky/main arm64 python3-packaging all 24.2-1 [51.5 kB]
198s Get:88 http://ftpmaster.internal/ubuntu plucky/universe arm64 python3-flasgger all 0.9.7.2~dev2+dfsg-3 [1693 kB]
199s Get:89 http://ftpmaster.internal/ubuntu plucky/main arm64 python3-greenlet arm64 3.1.0-1 [173 kB]
199s Get:90 http://ftpmaster.internal/ubuntu plucky/universe arm64 python3-httpbin all 0.10.2+dfsg-2 [89.0 kB]
199s Get:91 http://ftpmaster.internal/ubuntu plucky/main arm64 python3-imagesize all 1.4.1-1 [6844 B]
199s Get:92 http://ftpmaster.internal/ubuntu plucky/universe arm64 python3-iniconfig all 1.1.1-2 [6024 B]
199s Get:93 http://ftpmaster.internal/ubuntu plucky/universe arm64 python3-pluggy all 1.5.0-1 [21.0 kB]
199s Get:94 http://ftpmaster.internal/ubuntu plucky/universe arm64 python3-pytest all 8.3.4-1 [252 kB]
199s Get:95 http://ftpmaster.internal/ubuntu plucky/universe arm64 python3-pytest-httpbin all 2.1.0-1 [13.0 kB]
199s Get:96 http://ftpmaster.internal/ubuntu plucky/main arm64 python3-tornado arm64 6.4.1-3 [299 kB]
199s Get:97 http://ftpmaster.internal/ubuntu plucky/universe arm64 python3-pytest-tornado all 0.8.1-3 [7180 B]
199s Get:98 http://ftpmaster.internal/ubuntu plucky/main arm64 python3-snowballstemmer all 2.2.0-4build1 [59.8 kB]
199s Get:99 http://ftpmaster.internal/ubuntu plucky/main arm64 sphinx-common all 8.1.3-3 [661 kB]
199s Get:100 http://ftpmaster.internal/ubuntu plucky/main arm64 python3-alabaster all 0.7.16-0.1 [18.5 kB]
199s Get:101 http://ftpmaster.internal/ubuntu plucky/main arm64 python3-sphinx all 8.1.3-3 [474 kB]
199s Get:102 http://ftpmaster.internal/ubuntu plucky/main arm64 sphinx-rtd-theme-common all 3.0.2+dfsg-1 [1014 kB]
199s Get:103 http://ftpmaster.internal/ubuntu plucky/universe arm64 python3-sphinxcontrib.jquery all 4.1-5 [6678 B]
199s Get:104 http://ftpmaster.internal/ubuntu plucky/universe arm64 python3-sphinx-rtd-theme all 3.0.2+dfsg-1 [23.5 kB]
199s Get:105 http://ftpmaster.internal/ubuntu plucky/main arm64 python3-wrapt arm64 1.15.0-4 [34.3 kB]
199s Get:106 http://ftpmaster.internal/ubuntu plucky/universe arm64 python3-vcr all 6.0.2-2 [33.0 kB]
200s Fetched 116 MB in 20s (5842 kB/s)
200s Selecting previously unselected package fonts-lato.
200s (Reading database ... 80207 files and directories currently installed.)
200s Preparing to unpack .../000-fonts-lato_2.015-1_all.deb ...
200s Unpacking fonts-lato (2.015-1) ...
200s Selecting previously unselected package libpython3.13-minimal:arm64.
200s Preparing to unpack .../001-libpython3.13-minimal_3.13.1-2_arm64.deb ...
200s Unpacking libpython3.13-minimal:arm64 (3.13.1-2) ...
200s Selecting previously unselected package python3.13-minimal.
200s Preparing to unpack .../002-python3.13-minimal_3.13.1-2_arm64.deb ...
200s Unpacking python3.13-minimal (3.13.1-2) ...
201s Selecting previously unselected package sgml-base.
201s Preparing to unpack .../003-sgml-base_1.31_all.deb ...
201s Unpacking sgml-base (1.31) ...
201s Selecting previously unselected package m4.
201s Preparing to unpack .../004-m4_1.4.19-4build1_arm64.deb ...
201s Unpacking m4 (1.4.19-4build1) ...
201s Selecting previously unselected package autoconf.
201s Preparing to unpack .../005-autoconf_2.72-3_all.deb ...
201s Unpacking autoconf (2.72-3) ...
201s Selecting previously unselected package autotools-dev.
201s Preparing to unpack .../006-autotools-dev_20220109.1_all.deb ...
201s Unpacking autotools-dev (20220109.1) ...
201s Selecting previously unselected package automake.
201s Preparing to unpack .../007-automake_1%3a1.16.5-1.3ubuntu1_all.deb ...
201s Unpacking automake (1:1.16.5-1.3ubuntu1) ...
201s Selecting previously unselected package autopoint.
201s Preparing to unpack .../008-autopoint_0.22.5-3_all.deb ...
201s Unpacking autopoint (0.22.5-3) ...
201s Selecting previously unselected package libisl23:arm64.
201s Preparing to unpack .../009-libisl23_0.27-1_arm64.deb ...
201s Unpacking libisl23:arm64 (0.27-1) ...
201s Selecting previously unselected package libmpc3:arm64.
201s Preparing to unpack .../010-libmpc3_1.3.1-1build2_arm64.deb ...
201s Unpacking libmpc3:arm64 (1.3.1-1build2) ...
201s Selecting previously unselected package cpp-14-aarch64-linux-gnu.
201s Preparing to unpack .../011-cpp-14-aarch64-linux-gnu_14.2.0-13ubuntu1_arm64.deb ...
201s Unpacking cpp-14-aarch64-linux-gnu (14.2.0-13ubuntu1) ...
201s Selecting previously unselected package cpp-14.
201s Preparing to unpack .../012-cpp-14_14.2.0-13ubuntu1_arm64.deb ...
201s Unpacking cpp-14 (14.2.0-13ubuntu1) ...
201s Selecting previously unselected package cpp-aarch64-linux-gnu.
201s Preparing to unpack .../013-cpp-aarch64-linux-gnu_4%3a14.1.0-2ubuntu1_arm64.deb ...
201s Unpacking cpp-aarch64-linux-gnu (4:14.1.0-2ubuntu1) ...
201s Selecting previously unselected package cpp.
201s Preparing to unpack .../014-cpp_4%3a14.1.0-2ubuntu1_arm64.deb ...
201s Unpacking cpp (4:14.1.0-2ubuntu1) ...
201s Selecting previously unselected package libcc1-0:arm64.
201s Preparing to unpack .../015-libcc1-0_14.2.0-13ubuntu1_arm64.deb ...
201s Unpacking libcc1-0:arm64 (14.2.0-13ubuntu1) ...
201s Selecting previously unselected package libgomp1:arm64.
201s Preparing to unpack .../016-libgomp1_14.2.0-13ubuntu1_arm64.deb ...
201s Unpacking libgomp1:arm64 (14.2.0-13ubuntu1) ...
201s Selecting previously unselected package libitm1:arm64.
201s Preparing to unpack .../017-libitm1_14.2.0-13ubuntu1_arm64.deb ...
201s Unpacking libitm1:arm64 (14.2.0-13ubuntu1) ...
201s Selecting previously unselected package libasan8:arm64.
201s Preparing to unpack .../018-libasan8_14.2.0-13ubuntu1_arm64.deb ...
201s Unpacking libasan8:arm64 (14.2.0-13ubuntu1) ...
201s Selecting previously unselected package liblsan0:arm64.
201s Preparing to unpack .../019-liblsan0_14.2.0-13ubuntu1_arm64.deb ...
201s Unpacking liblsan0:arm64 (14.2.0-13ubuntu1) ...
201s Selecting previously unselected package libtsan2:arm64.
201s Preparing to unpack .../020-libtsan2_14.2.0-13ubuntu1_arm64.deb ...
201s Unpacking libtsan2:arm64 (14.2.0-13ubuntu1) ...
202s Selecting previously unselected package libubsan1:arm64.
202s Preparing to unpack .../021-libubsan1_14.2.0-13ubuntu1_arm64.deb ...
202s Unpacking libubsan1:arm64 (14.2.0-13ubuntu1) ...
202s Selecting previously unselected package libhwasan0:arm64.
202s Preparing to unpack .../022-libhwasan0_14.2.0-13ubuntu1_arm64.deb ...
202s Unpacking libhwasan0:arm64 (14.2.0-13ubuntu1) ...
202s Selecting previously unselected package libgcc-14-dev:arm64.
202s Preparing to unpack .../023-libgcc-14-dev_14.2.0-13ubuntu1_arm64.deb ...
202s Unpacking libgcc-14-dev:arm64 (14.2.0-13ubuntu1) ...
202s Selecting previously unselected package gcc-14-aarch64-linux-gnu.
202s Preparing to unpack .../024-gcc-14-aarch64-linux-gnu_14.2.0-13ubuntu1_arm64.deb ...
202s Unpacking gcc-14-aarch64-linux-gnu (14.2.0-13ubuntu1) ...
202s Selecting previously unselected package gcc-14.
202s Preparing to unpack .../025-gcc-14_14.2.0-13ubuntu1_arm64.deb ...
202s Unpacking gcc-14 (14.2.0-13ubuntu1) ...
202s Selecting previously unselected package gcc-aarch64-linux-gnu.
202s Preparing to unpack .../026-gcc-aarch64-linux-gnu_4%3a14.1.0-2ubuntu1_arm64.deb ...
202s Unpacking gcc-aarch64-linux-gnu (4:14.1.0-2ubuntu1) ...
202s Selecting previously unselected package gcc.
202s Preparing to unpack .../027-gcc_4%3a14.1.0-2ubuntu1_arm64.deb ...
202s Unpacking gcc (4:14.1.0-2ubuntu1) ...
202s Selecting previously unselected package libstdc++-14-dev:arm64.
202s Preparing to unpack .../028-libstdc++-14-dev_14.2.0-13ubuntu1_arm64.deb ...
202s Unpacking libstdc++-14-dev:arm64 (14.2.0-13ubuntu1) ...
202s Selecting previously unselected package g++-14-aarch64-linux-gnu.
202s Preparing to unpack .../029-g++-14-aarch64-linux-gnu_14.2.0-13ubuntu1_arm64.deb ...
202s Unpacking g++-14-aarch64-linux-gnu (14.2.0-13ubuntu1) ...
203s Selecting previously unselected package g++-14.
203s Preparing to unpack .../030-g++-14_14.2.0-13ubuntu1_arm64.deb ...
203s Unpacking g++-14 (14.2.0-13ubuntu1) ...
203s Selecting previously unselected package g++-aarch64-linux-gnu.
203s Preparing to unpack .../031-g++-aarch64-linux-gnu_4%3a14.1.0-2ubuntu1_arm64.deb ...
203s Unpacking g++-aarch64-linux-gnu (4:14.1.0-2ubuntu1) ... 203s Selecting previously unselected package g++. 203s Preparing to unpack .../032-g++_4%3a14.1.0-2ubuntu1_arm64.deb ... 203s Unpacking g++ (4:14.1.0-2ubuntu1) ... 203s Selecting previously unselected package build-essential. 203s Preparing to unpack .../033-build-essential_12.10ubuntu1_arm64.deb ... 203s Unpacking build-essential (12.10ubuntu1) ... 203s Selecting previously unselected package libdebhelper-perl. 203s Preparing to unpack .../034-libdebhelper-perl_13.20ubuntu1_all.deb ... 203s Unpacking libdebhelper-perl (13.20ubuntu1) ... 203s Selecting previously unselected package libtool. 203s Preparing to unpack .../035-libtool_2.4.7-8_all.deb ... 203s Unpacking libtool (2.4.7-8) ... 203s Selecting previously unselected package dh-autoreconf. 203s Preparing to unpack .../036-dh-autoreconf_20_all.deb ... 203s Unpacking dh-autoreconf (20) ... 203s Selecting previously unselected package libarchive-zip-perl. 203s Preparing to unpack .../037-libarchive-zip-perl_1.68-1_all.deb ... 203s Unpacking libarchive-zip-perl (1.68-1) ... 203s Selecting previously unselected package libfile-stripnondeterminism-perl. 203s Preparing to unpack .../038-libfile-stripnondeterminism-perl_1.14.0-1_all.deb ... 203s Unpacking libfile-stripnondeterminism-perl (1.14.0-1) ... 203s Selecting previously unselected package dh-strip-nondeterminism. 203s Preparing to unpack .../039-dh-strip-nondeterminism_1.14.0-1_all.deb ... 203s Unpacking dh-strip-nondeterminism (1.14.0-1) ... 203s Selecting previously unselected package debugedit. 203s Preparing to unpack .../040-debugedit_1%3a5.1-1_arm64.deb ... 203s Unpacking debugedit (1:5.1-1) ... 203s Selecting previously unselected package dwz. 203s Preparing to unpack .../041-dwz_0.15-1build6_arm64.deb ... 203s Unpacking dwz (0.15-1build6) ... 203s Selecting previously unselected package gettext. 203s Preparing to unpack .../042-gettext_0.22.5-3_arm64.deb ... 
203s Unpacking gettext (0.22.5-3) ... 203s Selecting previously unselected package intltool-debian. 203s Preparing to unpack .../043-intltool-debian_0.35.0+20060710.6_all.deb ... 203s Unpacking intltool-debian (0.35.0+20060710.6) ... 203s Selecting previously unselected package po-debconf. 203s Preparing to unpack .../044-po-debconf_1.0.21+nmu1_all.deb ... 203s Unpacking po-debconf (1.0.21+nmu1) ... 203s Selecting previously unselected package debhelper. 203s Preparing to unpack .../045-debhelper_13.20ubuntu1_all.deb ... 203s Unpacking debhelper (13.20ubuntu1) ... 203s Selecting previously unselected package dh-python. 203s Preparing to unpack .../046-dh-python_6.20241217_all.deb ... 203s Unpacking dh-python (6.20241217) ... 203s Selecting previously unselected package xml-core. 203s Preparing to unpack .../047-xml-core_0.19_all.deb ... 203s Unpacking xml-core (0.19) ... 203s Selecting previously unselected package docutils-common. 203s Preparing to unpack .../048-docutils-common_0.21.2+dfsg-2_all.deb ... 203s Unpacking docutils-common (0.21.2+dfsg-2) ... 204s Selecting previously unselected package fonts-font-awesome. 204s Preparing to unpack .../049-fonts-font-awesome_5.0.10+really4.7.0~dfsg-4.1_all.deb ... 204s Unpacking fonts-font-awesome (5.0.10+really4.7.0~dfsg-4.1) ... 204s Selecting previously unselected package libjs-jquery. 204s Preparing to unpack .../050-libjs-jquery_3.6.1+dfsg+~3.5.14-1_all.deb ... 204s Unpacking libjs-jquery (3.6.1+dfsg+~3.5.14-1) ... 204s Selecting previously unselected package libjs-underscore. 204s Preparing to unpack .../051-libjs-underscore_1.13.4~dfsg+~1.11.4-3_all.deb ... 204s Unpacking libjs-underscore (1.13.4~dfsg+~1.11.4-3) ... 204s Selecting previously unselected package libjs-sphinxdoc. 204s Preparing to unpack .../052-libjs-sphinxdoc_8.1.3-3_all.deb ... 204s Unpacking libjs-sphinxdoc (8.1.3-3) ... 204s Selecting previously unselected package libjson-perl. 204s Preparing to unpack .../053-libjson-perl_4.10000-1_all.deb ... 
204s Unpacking libjson-perl (4.10000-1) ... 204s Selecting previously unselected package liblua5.4-0:arm64. 204s Preparing to unpack .../054-liblua5.4-0_5.4.7-1_arm64.deb ... 204s Unpacking liblua5.4-0:arm64 (5.4.7-1) ... 204s Selecting previously unselected package libpython3.13-stdlib:arm64. 204s Preparing to unpack .../055-libpython3.13-stdlib_3.13.1-2_arm64.deb ... 204s Unpacking libpython3.13-stdlib:arm64 (3.13.1-2) ... 204s Selecting previously unselected package pandoc-data. 204s Preparing to unpack .../056-pandoc-data_3.1.11.1-3build1_all.deb ... 204s Unpacking pandoc-data (3.1.11.1-3build1) ... 204s Selecting previously unselected package pandoc. 204s Preparing to unpack .../057-pandoc_3.1.11.1+ds-2_arm64.deb ... 204s Unpacking pandoc (3.1.11.1+ds-2) ... 205s Selecting previously unselected package pybuild-plugin-autopkgtest. 205s Preparing to unpack .../058-pybuild-plugin-autopkgtest_6.20241217_all.deb ... 205s Unpacking pybuild-plugin-autopkgtest (6.20241217) ... 205s Selecting previously unselected package python-vcr-doc. 205s Preparing to unpack .../059-python-vcr-doc_6.0.2-2_all.deb ... 205s Unpacking python-vcr-doc (6.0.2-2) ... 205s Selecting previously unselected package python3-aiohappyeyeballs. 205s Preparing to unpack .../060-python3-aiohappyeyeballs_2.4.4-2_all.deb ... 205s Unpacking python3-aiohappyeyeballs (2.4.4-2) ... 205s Selecting previously unselected package python3-multidict. 205s Preparing to unpack .../061-python3-multidict_6.1.0-1build1_arm64.deb ... 205s Unpacking python3-multidict (6.1.0-1build1) ... 205s Selecting previously unselected package python3-yarl. 205s Preparing to unpack .../062-python3-yarl_1.13.1-1build1_arm64.deb ... 205s Unpacking python3-yarl (1.13.1-1build1) ... 205s Selecting previously unselected package python3-async-timeout. 205s Preparing to unpack .../063-python3-async-timeout_5.0.1-1_all.deb ... 205s Unpacking python3-async-timeout (5.0.1-1) ... 
205s Selecting previously unselected package python3-frozenlist. 205s Preparing to unpack .../064-python3-frozenlist_1.5.0-1build1_arm64.deb ... 205s Unpacking python3-frozenlist (1.5.0-1build1) ... 205s Selecting previously unselected package python3-aiosignal. 205s Preparing to unpack .../065-python3-aiosignal_1.3.2-1_all.deb ... 205s Unpacking python3-aiosignal (1.3.2-1) ... 205s Selecting previously unselected package python3-aiohttp. 205s Preparing to unpack .../066-python3-aiohttp_3.10.11-1_arm64.deb ... 205s Unpacking python3-aiohttp (3.10.11-1) ... 205s Selecting previously unselected package python3.13. 205s Preparing to unpack .../067-python3.13_3.13.1-2_arm64.deb ... 205s Unpacking python3.13 (3.13.1-2) ... 205s Selecting previously unselected package python3-all. 205s Preparing to unpack .../068-python3-all_3.12.8-1_arm64.deb ... 205s Unpacking python3-all (3.12.8-1) ... 205s Selecting previously unselected package python3-dateutil. 205s Preparing to unpack .../069-python3-dateutil_2.9.0-3_all.deb ... 205s Unpacking python3-dateutil (2.9.0-3) ... 205s Selecting previously unselected package python3-jmespath. 205s Preparing to unpack .../070-python3-jmespath_1.0.1-1_all.deb ... 205s Unpacking python3-jmespath (1.0.1-1) ... 205s Selecting previously unselected package python3-six. 205s Preparing to unpack .../071-python3-six_1.17.0-1_all.deb ... 205s Unpacking python3-six (1.17.0-1) ... 205s Selecting previously unselected package python3-botocore. 205s Preparing to unpack .../072-python3-botocore_1.34.46+repack-1ubuntu1_all.deb ... 205s Unpacking python3-botocore (1.34.46+repack-1ubuntu1) ... 206s Selecting previously unselected package python3-s3transfer. 206s Preparing to unpack .../073-python3-s3transfer_0.10.1-1ubuntu2_all.deb ... 206s Unpacking python3-s3transfer (0.10.1-1ubuntu2) ... 206s Selecting previously unselected package python3-boto3. 206s Preparing to unpack .../074-python3-boto3_1.34.46+dfsg-1ubuntu1_all.deb ... 
206s Unpacking python3-boto3 (1.34.46+dfsg-1ubuntu1) ... 206s Selecting previously unselected package python3-brotli. 206s Preparing to unpack .../075-python3-brotli_1.1.0-2build3_arm64.deb ... 206s Unpacking python3-brotli (1.1.0-2build3) ... 206s Selecting previously unselected package python3-brotlicffi. 206s Preparing to unpack .../076-python3-brotlicffi_1.1.0.0+ds1-1_arm64.deb ... 206s Unpacking python3-brotlicffi (1.1.0.0+ds1-1) ... 206s Selecting previously unselected package python3-click. 206s Preparing to unpack .../077-python3-click_8.1.8-1_all.deb ... 206s Unpacking python3-click (8.1.8-1) ... 206s Selecting previously unselected package python3-decorator. 206s Preparing to unpack .../078-python3-decorator_5.1.1-5_all.deb ... 206s Unpacking python3-decorator (5.1.1-5) ... 206s Selecting previously unselected package python3-defusedxml. 206s Preparing to unpack .../079-python3-defusedxml_0.7.1-3_all.deb ... 206s Unpacking python3-defusedxml (0.7.1-3) ... 206s Selecting previously unselected package python3-roman. 206s Preparing to unpack .../080-python3-roman_4.2-1_all.deb ... 206s Unpacking python3-roman (4.2-1) ... 206s Selecting previously unselected package python3-docutils. 206s Preparing to unpack .../081-python3-docutils_0.21.2+dfsg-2_all.deb ... 206s Unpacking python3-docutils (0.21.2+dfsg-2) ... 206s Selecting previously unselected package python3-itsdangerous. 206s Preparing to unpack .../082-python3-itsdangerous_2.2.0-1_all.deb ... 206s Unpacking python3-itsdangerous (2.2.0-1) ... 206s Selecting previously unselected package python3-werkzeug. 206s Preparing to unpack .../083-python3-werkzeug_3.1.3-2_all.deb ... 206s Unpacking python3-werkzeug (3.1.3-2) ... 206s Selecting previously unselected package python3-flask. 206s Preparing to unpack .../084-python3-flask_3.1.0-2ubuntu1_all.deb ... 206s Unpacking python3-flask (3.1.0-2ubuntu1) ... 206s Selecting previously unselected package python3-mistune. 
206s Preparing to unpack .../085-python3-mistune_3.0.2-2_all.deb ... 206s Unpacking python3-mistune (3.0.2-2) ... 207s Selecting previously unselected package python3-packaging. 207s Preparing to unpack .../086-python3-packaging_24.2-1_all.deb ... 207s Unpacking python3-packaging (24.2-1) ... 207s Selecting previously unselected package python3-flasgger. 207s Preparing to unpack .../087-python3-flasgger_0.9.7.2~dev2+dfsg-3_all.deb ... 207s Unpacking python3-flasgger (0.9.7.2~dev2+dfsg-3) ... 207s Selecting previously unselected package python3-greenlet. 207s Preparing to unpack .../088-python3-greenlet_3.1.0-1_arm64.deb ... 207s Unpacking python3-greenlet (3.1.0-1) ... 207s Selecting previously unselected package python3-httpbin. 207s Preparing to unpack .../089-python3-httpbin_0.10.2+dfsg-2_all.deb ... 207s Unpacking python3-httpbin (0.10.2+dfsg-2) ... 207s Selecting previously unselected package python3-imagesize. 207s Preparing to unpack .../090-python3-imagesize_1.4.1-1_all.deb ... 207s Unpacking python3-imagesize (1.4.1-1) ... 207s Selecting previously unselected package python3-iniconfig. 207s Preparing to unpack .../091-python3-iniconfig_1.1.1-2_all.deb ... 207s Unpacking python3-iniconfig (1.1.1-2) ... 207s Selecting previously unselected package python3-pluggy. 207s Preparing to unpack .../092-python3-pluggy_1.5.0-1_all.deb ... 207s Unpacking python3-pluggy (1.5.0-1) ... 207s Selecting previously unselected package python3-pytest. 207s Preparing to unpack .../093-python3-pytest_8.3.4-1_all.deb ... 207s Unpacking python3-pytest (8.3.4-1) ... 207s Selecting previously unselected package python3-pytest-httpbin. 207s Preparing to unpack .../094-python3-pytest-httpbin_2.1.0-1_all.deb ... 207s Unpacking python3-pytest-httpbin (2.1.0-1) ... 207s Selecting previously unselected package python3-tornado. 207s Preparing to unpack .../095-python3-tornado_6.4.1-3_arm64.deb ... 207s Unpacking python3-tornado (6.4.1-3) ... 
207s Selecting previously unselected package python3-pytest-tornado. 207s Preparing to unpack .../096-python3-pytest-tornado_0.8.1-3_all.deb ... 207s Unpacking python3-pytest-tornado (0.8.1-3) ... 207s Selecting previously unselected package python3-snowballstemmer. 207s Preparing to unpack .../097-python3-snowballstemmer_2.2.0-4build1_all.deb ... 207s Unpacking python3-snowballstemmer (2.2.0-4build1) ... 207s Selecting previously unselected package sphinx-common. 207s Preparing to unpack .../098-sphinx-common_8.1.3-3_all.deb ... 207s Unpacking sphinx-common (8.1.3-3) ... 207s Selecting previously unselected package python3-alabaster. 207s Preparing to unpack .../099-python3-alabaster_0.7.16-0.1_all.deb ... 207s Unpacking python3-alabaster (0.7.16-0.1) ... 207s Selecting previously unselected package python3-sphinx. 207s Preparing to unpack .../100-python3-sphinx_8.1.3-3_all.deb ... 207s Unpacking python3-sphinx (8.1.3-3) ... 207s Selecting previously unselected package sphinx-rtd-theme-common. 207s Preparing to unpack .../101-sphinx-rtd-theme-common_3.0.2+dfsg-1_all.deb ... 207s Unpacking sphinx-rtd-theme-common (3.0.2+dfsg-1) ... 207s Selecting previously unselected package python3-sphinxcontrib.jquery. 207s Preparing to unpack .../102-python3-sphinxcontrib.jquery_4.1-5_all.deb ... 207s Unpacking python3-sphinxcontrib.jquery (4.1-5) ... 207s Selecting previously unselected package python3-sphinx-rtd-theme. 207s Preparing to unpack .../103-python3-sphinx-rtd-theme_3.0.2+dfsg-1_all.deb ... 207s Unpacking python3-sphinx-rtd-theme (3.0.2+dfsg-1) ... 207s Selecting previously unselected package python3-wrapt. 207s Preparing to unpack .../104-python3-wrapt_1.15.0-4_arm64.deb ... 207s Unpacking python3-wrapt (1.15.0-4) ... 207s Selecting previously unselected package python3-vcr. 207s Preparing to unpack .../105-python3-vcr_6.0.2-2_all.deb ... 207s Unpacking python3-vcr (6.0.2-2) ... 207s Setting up dh-python (6.20241217) ... 
208s Setting up python3-iniconfig (1.1.1-2) ... 208s Setting up python3-tornado (6.4.1-3) ... 209s Setting up python3-brotlicffi (1.1.0.0+ds1-1) ... 209s Setting up fonts-lato (2.015-1) ... 209s Setting up python3-defusedxml (0.7.1-3) ... 209s Setting up libarchive-zip-perl (1.68-1) ... 209s Setting up python3-alabaster (0.7.16-0.1) ... 209s Setting up libdebhelper-perl (13.20ubuntu1) ... 209s Setting up m4 (1.4.19-4build1) ... 209s Setting up python3-itsdangerous (2.2.0-1) ... 209s Setting up libgomp1:arm64 (14.2.0-13ubuntu1) ... 209s Setting up python3-click (8.1.8-1) ... 210s Setting up python3-multidict (6.1.0-1build1) ... 210s Setting up python3-frozenlist (1.5.0-1build1) ... 210s Setting up python3-aiosignal (1.3.2-1) ... 210s Setting up python3-async-timeout (5.0.1-1) ... 210s Setting up python3-six (1.17.0-1) ... 211s Setting up libpython3.13-minimal:arm64 (3.13.1-2) ... 211s Setting up python3-roman (4.2-1) ... 211s Setting up python3-decorator (5.1.1-5) ... 211s Setting up autotools-dev (20220109.1) ... 211s Setting up python3-packaging (24.2-1) ... 211s Setting up python3-snowballstemmer (2.2.0-4build1) ... 212s Setting up python3-werkzeug (3.1.3-2) ... 212s Setting up python3-jmespath (1.0.1-1) ... 213s Setting up python3-brotli (1.1.0-2build3) ... 213s Setting up python3-greenlet (3.1.0-1) ... 213s Setting up libmpc3:arm64 (1.3.1-1build2) ... 213s Setting up python3-wrapt (1.15.0-4) ... 213s Setting up autopoint (0.22.5-3) ... 213s Setting up python3-aiohappyeyeballs (2.4.4-2) ... 213s Setting up autoconf (2.72-3) ... 213s Setting up python3-pluggy (1.5.0-1) ... 214s Setting up libubsan1:arm64 (14.2.0-13ubuntu1) ... 214s Setting up dwz (0.15-1build6) ... 214s Setting up libhwasan0:arm64 (14.2.0-13ubuntu1) ... 214s Setting up libasan8:arm64 (14.2.0-13ubuntu1) ... 214s Setting up libjson-perl (4.10000-1) ... 214s Setting up debugedit (1:5.1-1) ... 214s Setting up liblua5.4-0:arm64 (5.4.7-1) ... 214s Setting up python3.13-minimal (3.13.1-2) ... 
214s Setting up python3-dateutil (2.9.0-3) ... 215s Setting up sgml-base (1.31) ... 215s Setting up pandoc-data (3.1.11.1-3build1) ... 215s Setting up libtsan2:arm64 (14.2.0-13ubuntu1) ... 215s Setting up libjs-jquery (3.6.1+dfsg+~3.5.14-1) ... 215s Setting up libisl23:arm64 (0.27-1) ... 215s Setting up python3-yarl (1.13.1-1build1) ... 215s Setting up python3-mistune (3.0.2-2) ... 215s Setting up libpython3.13-stdlib:arm64 (3.13.1-2) ... 215s Setting up fonts-font-awesome (5.0.10+really4.7.0~dfsg-4.1) ... 215s Setting up sphinx-rtd-theme-common (3.0.2+dfsg-1) ... 215s Setting up libcc1-0:arm64 (14.2.0-13ubuntu1) ... 215s Setting up liblsan0:arm64 (14.2.0-13ubuntu1) ... 215s Setting up libitm1:arm64 (14.2.0-13ubuntu1) ... 215s Setting up libjs-underscore (1.13.4~dfsg+~1.11.4-3) ... 215s Setting up python3-imagesize (1.4.1-1) ... 215s Setting up automake (1:1.16.5-1.3ubuntu1) ... 215s update-alternatives: using /usr/bin/automake-1.16 to provide /usr/bin/automake (automake) in auto mode 215s Setting up libfile-stripnondeterminism-perl (1.14.0-1) ... 215s Setting up gettext (0.22.5-3) ... 215s Setting up python3.13 (3.13.1-2) ... 216s Setting up python3-pytest (8.3.4-1) ... 217s Setting up python3-flask (3.1.0-2ubuntu1) ... 217s Setting up python3-aiohttp (3.10.11-1) ... 218s Setting up python3-all (3.12.8-1) ... 218s Setting up intltool-debian (0.35.0+20060710.6) ... 218s Setting up pandoc (3.1.11.1+ds-2) ... 218s Setting up python3-pytest-tornado (0.8.1-3) ... 218s Setting up python3-botocore (1.34.46+repack-1ubuntu1) ... 219s Setting up python3-vcr (6.0.2-2) ... 219s Setting up libjs-sphinxdoc (8.1.3-3) ... 219s Setting up dh-strip-nondeterminism (1.14.0-1) ... 219s Setting up cpp-14-aarch64-linux-gnu (14.2.0-13ubuntu1) ... 219s Setting up xml-core (0.19) ... 219s Setting up libgcc-14-dev:arm64 (14.2.0-13ubuntu1) ... 219s Setting up libstdc++-14-dev:arm64 (14.2.0-13ubuntu1) ... 219s Setting up python-vcr-doc (6.0.2-2) ... 
219s Setting up python3-flasgger (0.9.7.2~dev2+dfsg-3) ... 219s Setting up po-debconf (1.0.21+nmu1) ... 219s Setting up python3-s3transfer (0.10.1-1ubuntu2) ... 220s Setting up cpp-aarch64-linux-gnu (4:14.1.0-2ubuntu1) ... 220s Setting up sphinx-common (8.1.3-3) ... 220s Setting up python3-boto3 (1.34.46+dfsg-1ubuntu1) ... 220s Setting up cpp-14 (14.2.0-13ubuntu1) ... 220s Setting up python3-httpbin (0.10.2+dfsg-2) ... 220s Setting up cpp (4:14.1.0-2ubuntu1) ... 220s Setting up python3-pytest-httpbin (2.1.0-1) ... 220s Setting up gcc-14-aarch64-linux-gnu (14.2.0-13ubuntu1) ... 220s Setting up gcc-aarch64-linux-gnu (4:14.1.0-2ubuntu1) ... 220s Setting up g++-14-aarch64-linux-gnu (14.2.0-13ubuntu1) ... 220s Setting up gcc-14 (14.2.0-13ubuntu1) ... 220s Setting up g++-aarch64-linux-gnu (4:14.1.0-2ubuntu1) ... 220s Setting up g++-14 (14.2.0-13ubuntu1) ... 220s Setting up libtool (2.4.7-8) ... 220s Setting up gcc (4:14.1.0-2ubuntu1) ... 220s Setting up dh-autoreconf (20) ... 220s Setting up g++ (4:14.1.0-2ubuntu1) ... 220s update-alternatives: using /usr/bin/g++ to provide /usr/bin/c++ (c++) in auto mode 220s Setting up build-essential (12.10ubuntu1) ... 220s Setting up debhelper (13.20ubuntu1) ... 220s Setting up pybuild-plugin-autopkgtest (6.20241217) ... 220s Processing triggers for install-info (7.1.1-1) ... 220s Processing triggers for libc-bin (2.40-4ubuntu1) ... 220s Processing triggers for systemd (257-2ubuntu1) ... 220s Processing triggers for man-db (2.13.0-1) ... 222s Processing triggers for sgml-base (1.31) ... 222s Setting up docutils-common (0.21.2+dfsg-2) ... 222s Processing triggers for sgml-base (1.31) ... 222s Setting up python3-docutils (0.21.2+dfsg-2) ... 223s Setting up python3-sphinx (8.1.3-3) ... 224s Setting up python3-sphinxcontrib.jquery (4.1-5) ... 225s Setting up python3-sphinx-rtd-theme (3.0.2+dfsg-1) ... 
226s autopkgtest [02:31:36]: test pybuild-autopkgtest: pybuild-autopkgtest
226s autopkgtest [02:31:36]: test pybuild-autopkgtest: [-----------------------
226s pybuild-autopkgtest
227s I: pybuild base:311: cd /tmp/autopkgtest.vKDz2s/autopkgtest_tmp/build; python3.13 -m pytest --ignore tests/integration/test_aiohttp.py --ignore tests/integration/test_tornado.py --ignore tests/integration/test_requests.py -m "not online" -k "not test_basic_json_use and not test_load_cassette_with_custom_persister"
229s ============================= test session starts ==============================
229s platform linux -- Python 3.13.1, pytest-8.3.4, pluggy-1.5.0
229s rootdir: /tmp/autopkgtest.vKDz2s/autopkgtest_tmp/build
229s plugins: tornado-0.8.1, httpbin-2.1.0, typeguard-4.4.1
229s collected 305 items / 19 deselected / 1 skipped / 286 selected
229s
229s tests/integration/test_basic.py .... [ 1%]
229s tests/integration/test_boto3.py ss [ 2%]
229s tests/integration/test_config.py . [ 2%]
229s tests/integration/test_filter.py .......... [ 5%]
229s tests/integration/test_httplib2.py ........ [ 8%]
230s tests/integration/test_urllib2.py ........ [ 11%]
230s tests/integration/test_urllib3.py FFFFFFF [ 13%]
230s tests/integration/test_httplib2.py ........ [ 16%]
230s tests/integration/test_urllib2.py ........ [ 19%]
230s tests/integration/test_urllib3.py FFFFFFF [ 22%]
230s tests/integration/test_httplib2.py . [ 22%]
231s tests/integration/test_ignore.py .... [ 23%]
231s tests/integration/test_matchers.py .............. [ 28%]
231s tests/integration/test_multiple.py . [ 29%]
231s tests/integration/test_proxy.py F [ 29%]
231s tests/integration/test_record_mode.py ........ [ 32%]
231s tests/integration/test_register_persister.py .. [ 32%]
231s tests/integration/test_register_serializer.py . [ 33%]
231s tests/integration/test_request.py .. [ 33%]
231s tests/integration/test_stubs.py .... [ 35%]
231s tests/integration/test_urllib2.py . [ 35%]
231s tests/integration/test_urllib3.py FF. [ 36%]
231s tests/integration/test_wild.py F.F. [ 38%]
231s tests/unit/test_cassettes.py ............................... [ 48%]
231s tests/unit/test_errors.py .... [ 50%]
231s tests/unit/test_filters.py ........................ [ 58%]
231s tests/unit/test_json_serializer.py . [ 59%]
231s tests/unit/test_matchers.py ............................ [ 68%]
231s tests/unit/test_migration.py ... [ 69%]
231s tests/unit/test_persist.py .... [ 71%]
231s tests/unit/test_request.py ................. [ 77%]
231s tests/unit/test_response.py .... [ 78%]
231s tests/unit/test_serialize.py ............... [ 83%]
232s tests/unit/test_stubs.py ... [ 84%]
232s tests/unit/test_unittest.py ....... [ 87%]
232s tests/unit/test_util.py ........... [ 91%]
232s tests/unit/test_vcr.py ........................ [ 99%]
233s tests/unit/test_vcr_import.py . [100%]
233s
233s =================================== FAILURES ===================================
233s ____________________________ test_status_code[http] ____________________________
233s
233s httpbin_both =
233s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-0/test_status_code_http_0')
233s verify_pool_mgr =
233s
233s def test_status_code(httpbin_both, tmpdir, verify_pool_mgr):
233s """Ensure that we can read the status code"""
233s url = httpbin_both.url
233s with vcr.use_cassette(str(tmpdir.join("atts.yaml"))):
233s > status_code = verify_pool_mgr.request("GET", url).status
233s
233s tests/integration/test_urllib3.py:34:
233s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
233s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request
233s return self.request_encode_url(
233s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url
233s return self.urlopen(method, url, **extra_kw)
233s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen
233s response = conn.urlopen(method, u.request_uri, **kw)
233s
/usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 233s response = self._make_request( 233s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 233s 233s self = 233s conn = 233s method = 'GET', url = '/', body = None, headers = {} 233s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 233s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 233s chunked = False, response_conn = None, preload_content = True 233s decode_content = True, enforce_content_length = True 233s 233s def _make_request( 233s self, 233s conn: BaseHTTPConnection, 233s method: str, 233s url: str, 233s body: _TYPE_BODY | None = None, 233s headers: typing.Mapping[str, str] | None = None, 233s retries: Retry | None = None, 233s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 233s chunked: bool = False, 233s response_conn: BaseHTTPConnection | None = None, 233s preload_content: bool = True, 233s decode_content: bool = True, 233s enforce_content_length: bool = True, 233s ) -> BaseHTTPResponse: 233s """ 233s Perform a request on a given urllib connection object taken from our 233s pool. 233s 233s :param conn: 233s a connection from one of our connection pools 233s 233s :param method: 233s HTTP request method (such as GET, POST, PUT, etc.) 233s 233s :param url: 233s The URL to perform the request on. 233s 233s :param body: 233s Data to send in the request body, either :class:`str`, :class:`bytes`, 233s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 233s 233s :param headers: 233s Dictionary of custom headers to send, such as User-Agent, 233s If-None-Match, etc. If None, pool headers are used. If provided, 233s these headers completely replace any pool-specific headers. 233s 233s :param retries: 233s Configure the number of retries to allow before raising a 233s :class:`~urllib3.exceptions.MaxRetryError` exception. 
233s 233s Pass ``None`` to retry until you receive a response. Pass a 233s :class:`~urllib3.util.retry.Retry` object for fine-grained control 233s over different types of retries. 233s Pass an integer number to retry connection errors that many times, 233s but no other types of errors. Pass zero to never retry. 233s 233s If ``False``, then retries are disabled and any exception is raised 233s immediately. Also, instead of raising a MaxRetryError on redirects, 233s the redirect response will be returned. 233s 233s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 233s 233s :param timeout: 233s If specified, overrides the default timeout for this one 233s request. It may be a float (in seconds) or an instance of 233s :class:`urllib3.util.Timeout`. 233s 233s :param chunked: 233s If True, urllib3 will send the body using chunked transfer 233s encoding. Otherwise, urllib3 will send the body using the standard 233s content-length form. Defaults to False. 233s 233s :param response_conn: 233s Set this to ``None`` if you will handle releasing the connection or 233s set the connection to have the response release it. 233s 233s :param preload_content: 233s If True, the response's body will be preloaded during construction. 233s 233s :param decode_content: 233s If True, will attempt to decode the body based on the 233s 'content-encoding' header. 233s 233s :param enforce_content_length: 233s Enforce content length checking. Body returned by server must match 233s value of Content-Length header, if present. Otherwise, raise error. 233s """ 233s self.num_requests += 1 233s 233s timeout_obj = self._get_timeout(timeout) 233s timeout_obj.start_connect() 233s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 233s 233s try: 233s # Trigger any extra validation we need to do. 
233s try: 233s self._validate_conn(conn) 233s except (SocketTimeout, BaseSSLError) as e: 233s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 233s raise 233s 233s # _validate_conn() starts the connection to an HTTPS proxy 233s # so we need to wrap errors with 'ProxyError' here too. 233s except ( 233s OSError, 233s NewConnectionError, 233s TimeoutError, 233s BaseSSLError, 233s CertificateError, 233s SSLError, 233s ) as e: 233s new_e: Exception = e 233s if isinstance(e, (BaseSSLError, CertificateError)): 233s new_e = SSLError(e) 233s # If the connection didn't successfully connect to it's proxy 233s # then there 233s if isinstance( 233s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 233s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 233s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 233s raise new_e 233s 233s # conn.request() calls http.client.*.request, not the method in 233s # urllib3.request. It also calls makefile (recv) on the socket. 233s try: 233s conn.request( 233s method, 233s url, 233s body=body, 233s headers=headers, 233s chunked=chunked, 233s preload_content=preload_content, 233s decode_content=decode_content, 233s enforce_content_length=enforce_content_length, 233s ) 233s 233s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 233s # legitimately able to close the connection after sending a valid response. 233s # With this behaviour, the received response is still readable. 233s except BrokenPipeError: 233s pass 233s except OSError as e: 233s # MacOS/Linux 233s # EPROTOTYPE and ECONNRESET are needed on macOS 233s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 233s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
233s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 233s raise 233s 233s # Reset the timeout for the recv() on the socket 233s read_timeout = timeout_obj.read_timeout 233s 233s if not conn.is_closed: 233s # In Python 3 socket.py will catch EAGAIN and return None when you 233s # try and read into the file pointer created by http.client, which 233s # instead raises a BadStatusLine exception. Instead of catching 233s # the exception and assuming all BadStatusLine exceptions are read 233s # timeouts, check for a zero timeout before making the request. 233s if read_timeout == 0: 233s raise ReadTimeoutError( 233s self, url, f"Read timed out. (read timeout={read_timeout})" 233s ) 233s conn.timeout = read_timeout 233s 233s # Receive the response from the server 233s try: 233s response = conn.getresponse() 233s except (BaseSSLError, OSError) as e: 233s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 233s raise 233s 233s # Set properties that are used by the pooling layer. 
233s response.retries = retries
233s response._connection = response_conn # type: ignore[attr-defined]
233s response._pool = self # type: ignore[attr-defined]
233s
233s log.debug(
233s '%s://%s:%s "%s %s %s" %s %s',
233s self.scheme,
233s self.host,
233s self.port,
233s method,
233s url,
233s > response.version_string,
233s response.status,
233s response.length_remaining,
233s )
233s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
233s
233s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError
233s ----------------------------- Captured stderr call -----------------------------
233s 127.0.0.1 - - [18/Jan/2025 02:31:39] "GET / HTTP/1.1" 200 9358
233s ______________________________ test_headers[http] ______________________________
233s
233s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-0/test_headers_http_0')
233s httpbin_both =
233s verify_pool_mgr =
233s
233s def test_headers(tmpdir, httpbin_both, verify_pool_mgr):
233s """Ensure that we can read the headers back"""
233s url = httpbin_both.url
233s with vcr.use_cassette(str(tmpdir.join("headers.yaml"))):
233s > headers = verify_pool_mgr.request("GET", url).headers
233s
233s tests/integration/test_urllib3.py:44:
233s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
233s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request
233s return self.request_encode_url(
233s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url
233s return self.urlopen(method, url, **extra_kw)
233s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen
233s response = conn.urlopen(method, u.request_uri, **kw)
233s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen
233s response = self._make_request(
233s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
233s
233s self =
233s conn =
233s method = 'GET', url = '/', body = None, headers = {}
233s
retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 233s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 233s chunked = False, response_conn = None, preload_content = True 233s decode_content = True, enforce_content_length = True 233s 233s def _make_request( 233s self, 233s conn: BaseHTTPConnection, 233s method: str, 233s url: str, 233s body: _TYPE_BODY | None = None, 233s headers: typing.Mapping[str, str] | None = None, 233s retries: Retry | None = None, 233s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 233s chunked: bool = False, 233s response_conn: BaseHTTPConnection | None = None, 233s preload_content: bool = True, 233s decode_content: bool = True, 233s enforce_content_length: bool = True, 233s ) -> BaseHTTPResponse: 233s """ 233s Perform a request on a given urllib connection object taken from our 233s pool. 233s 233s :param conn: 233s a connection from one of our connection pools 233s 233s :param method: 233s HTTP request method (such as GET, POST, PUT, etc.) 233s 233s :param url: 233s The URL to perform the request on. 233s 233s :param body: 233s Data to send in the request body, either :class:`str`, :class:`bytes`, 233s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 233s 233s :param headers: 233s Dictionary of custom headers to send, such as User-Agent, 233s If-None-Match, etc. If None, pool headers are used. If provided, 233s these headers completely replace any pool-specific headers. 233s 233s :param retries: 233s Configure the number of retries to allow before raising a 233s :class:`~urllib3.exceptions.MaxRetryError` exception. 233s 233s Pass ``None`` to retry until you receive a response. Pass a 233s :class:`~urllib3.util.retry.Retry` object for fine-grained control 233s over different types of retries. 233s Pass an integer number to retry connection errors that many times, 233s but no other types of errors. Pass zero to never retry. 
233s 233s If ``False``, then retries are disabled and any exception is raised 233s immediately. Also, instead of raising a MaxRetryError on redirects, 233s the redirect response will be returned. 233s 233s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 233s 233s :param timeout: 233s If specified, overrides the default timeout for this one 233s request. It may be a float (in seconds) or an instance of 233s :class:`urllib3.util.Timeout`. 233s 233s :param chunked: 233s If True, urllib3 will send the body using chunked transfer 233s encoding. Otherwise, urllib3 will send the body using the standard 233s content-length form. Defaults to False. 233s 233s :param response_conn: 233s Set this to ``None`` if you will handle releasing the connection or 233s set the connection to have the response release it. 233s 233s :param preload_content: 233s If True, the response's body will be preloaded during construction. 233s 233s :param decode_content: 233s If True, will attempt to decode the body based on the 233s 'content-encoding' header. 233s 233s :param enforce_content_length: 233s Enforce content length checking. Body returned by server must match 233s value of Content-Length header, if present. Otherwise, raise error. 233s """ 233s self.num_requests += 1 233s 233s timeout_obj = self._get_timeout(timeout) 233s timeout_obj.start_connect() 233s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 233s 233s try: 233s # Trigger any extra validation we need to do. 233s try: 233s self._validate_conn(conn) 233s except (SocketTimeout, BaseSSLError) as e: 233s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 233s raise 233s 233s # _validate_conn() starts the connection to an HTTPS proxy 233s # so we need to wrap errors with 'ProxyError' here too. 
233s except ( 233s OSError, 233s NewConnectionError, 233s TimeoutError, 233s BaseSSLError, 233s CertificateError, 233s SSLError, 233s ) as e: 233s new_e: Exception = e 233s if isinstance(e, (BaseSSLError, CertificateError)): 233s new_e = SSLError(e) 233s # If the connection didn't successfully connect to it's proxy 233s # then there 233s if isinstance( 233s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 233s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 233s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 233s raise new_e 233s 233s # conn.request() calls http.client.*.request, not the method in 233s # urllib3.request. It also calls makefile (recv) on the socket. 233s try: 233s conn.request( 233s method, 233s url, 233s body=body, 233s headers=headers, 233s chunked=chunked, 233s preload_content=preload_content, 233s decode_content=decode_content, 233s enforce_content_length=enforce_content_length, 233s ) 233s 233s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 233s # legitimately able to close the connection after sending a valid response. 233s # With this behaviour, the received response is still readable. 233s except BrokenPipeError: 233s pass 233s except OSError as e: 233s # MacOS/Linux 233s # EPROTOTYPE and ECONNRESET are needed on macOS 233s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 233s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 233s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 233s raise 233s 233s # Reset the timeout for the recv() on the socket 233s read_timeout = timeout_obj.read_timeout 233s 233s if not conn.is_closed: 233s # In Python 3 socket.py will catch EAGAIN and return None when you 233s # try and read into the file pointer created by http.client, which 233s # instead raises a BadStatusLine exception. 
Instead of catching 233s # the exception and assuming all BadStatusLine exceptions are read 233s # timeouts, check for a zero timeout before making the request. 233s if read_timeout == 0: 233s raise ReadTimeoutError( 233s self, url, f"Read timed out. (read timeout={read_timeout})" 233s ) 233s conn.timeout = read_timeout 233s 233s # Receive the response from the server 233s try: 233s response = conn.getresponse() 233s except (BaseSSLError, OSError) as e: 233s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 233s raise 233s 233s # Set properties that are used by the pooling layer. 233s response.retries = retries 233s response._connection = response_conn # type: ignore[attr-defined] 233s response._pool = self # type: ignore[attr-defined] 233s 233s log.debug( 233s '%s://%s:%s "%s %s %s" %s %s', 233s self.scheme, 233s self.host, 233s self.port, 233s method, 233s url, 233s > response.version_string, 233s response.status, 233s response.length_remaining, 233s ) 233s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 233s 233s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 233s ----------------------------- Captured stderr call ----------------------------- 233s 127.0.0.1 - - [18/Jan/2025 02:31:40] "GET / HTTP/1.1" 200 9358 233s _______________________________ test_body[http] ________________________________ 233s 233s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-0/test_body_http_0') 233s httpbin_both = 233s verify_pool_mgr = 233s 233s def test_body(tmpdir, httpbin_both, verify_pool_mgr): 233s """Ensure the responses are all identical enough""" 233s url = httpbin_both.url + "/bytes/1024" 233s with vcr.use_cassette(str(tmpdir.join("body.yaml"))): 233s > content = verify_pool_mgr.request("GET", url).data 233s 233s tests/integration/test_urllib3.py:55: 233s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 233s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in 
request 233s return self.request_encode_url( 233s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 233s return self.urlopen(method, url, **extra_kw) 233s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 233s response = conn.urlopen(method, u.request_uri, **kw) 233s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 233s response = self._make_request( 233s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 233s 233s self = 233s conn = 233s method = 'GET', url = '/bytes/1024', body = None, headers = {} 233s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 233s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 233s chunked = False, response_conn = None, preload_content = True 233s decode_content = True, enforce_content_length = True 233s 233s def _make_request( 233s self, 233s conn: BaseHTTPConnection, 233s method: str, 233s url: str, 233s body: _TYPE_BODY | None = None, 233s headers: typing.Mapping[str, str] | None = None, 233s retries: Retry | None = None, 233s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 233s chunked: bool = False, 233s response_conn: BaseHTTPConnection | None = None, 233s preload_content: bool = True, 233s decode_content: bool = True, 233s enforce_content_length: bool = True, 233s ) -> BaseHTTPResponse: 233s """ 233s Perform a request on a given urllib connection object taken from our 233s pool. 233s 233s :param conn: 233s a connection from one of our connection pools 233s 233s :param method: 233s HTTP request method (such as GET, POST, PUT, etc.) 233s 233s :param url: 233s The URL to perform the request on. 233s 233s :param body: 233s Data to send in the request body, either :class:`str`, :class:`bytes`, 233s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 
233s 233s :param headers: 233s Dictionary of custom headers to send, such as User-Agent, 233s If-None-Match, etc. If None, pool headers are used. If provided, 233s these headers completely replace any pool-specific headers. 233s 233s :param retries: 233s Configure the number of retries to allow before raising a 233s :class:`~urllib3.exceptions.MaxRetryError` exception. 233s 233s Pass ``None`` to retry until you receive a response. Pass a 233s :class:`~urllib3.util.retry.Retry` object for fine-grained control 233s over different types of retries. 233s Pass an integer number to retry connection errors that many times, 233s but no other types of errors. Pass zero to never retry. 233s 233s If ``False``, then retries are disabled and any exception is raised 233s immediately. Also, instead of raising a MaxRetryError on redirects, 233s the redirect response will be returned. 233s 233s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 233s 233s :param timeout: 233s If specified, overrides the default timeout for this one 233s request. It may be a float (in seconds) or an instance of 233s :class:`urllib3.util.Timeout`. 233s 233s :param chunked: 233s If True, urllib3 will send the body using chunked transfer 233s encoding. Otherwise, urllib3 will send the body using the standard 233s content-length form. Defaults to False. 233s 233s :param response_conn: 233s Set this to ``None`` if you will handle releasing the connection or 233s set the connection to have the response release it. 233s 233s :param preload_content: 233s If True, the response's body will be preloaded during construction. 233s 233s :param decode_content: 233s If True, will attempt to decode the body based on the 233s 'content-encoding' header. 233s 233s :param enforce_content_length: 233s Enforce content length checking. Body returned by server must match 233s value of Content-Length header, if present. Otherwise, raise error. 
233s """ 233s self.num_requests += 1 233s 233s timeout_obj = self._get_timeout(timeout) 233s timeout_obj.start_connect() 233s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 233s 233s try: 233s # Trigger any extra validation we need to do. 233s try: 233s self._validate_conn(conn) 233s except (SocketTimeout, BaseSSLError) as e: 233s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 233s raise 233s 233s # _validate_conn() starts the connection to an HTTPS proxy 233s # so we need to wrap errors with 'ProxyError' here too. 233s except ( 233s OSError, 233s NewConnectionError, 233s TimeoutError, 233s BaseSSLError, 233s CertificateError, 233s SSLError, 233s ) as e: 233s new_e: Exception = e 233s if isinstance(e, (BaseSSLError, CertificateError)): 233s new_e = SSLError(e) 233s # If the connection didn't successfully connect to it's proxy 233s # then there 233s if isinstance( 233s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 233s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 233s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 233s raise new_e 233s 233s # conn.request() calls http.client.*.request, not the method in 233s # urllib3.request. It also calls makefile (recv) on the socket. 233s try: 233s conn.request( 233s method, 233s url, 233s body=body, 233s headers=headers, 233s chunked=chunked, 233s preload_content=preload_content, 233s decode_content=decode_content, 233s enforce_content_length=enforce_content_length, 233s ) 233s 233s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 233s # legitimately able to close the connection after sending a valid response. 233s # With this behaviour, the received response is still readable. 
233s except BrokenPipeError: 233s pass 233s except OSError as e: 233s # MacOS/Linux 233s # EPROTOTYPE and ECONNRESET are needed on macOS 233s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 233s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 233s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 233s raise 233s 233s # Reset the timeout for the recv() on the socket 233s read_timeout = timeout_obj.read_timeout 233s 233s if not conn.is_closed: 233s # In Python 3 socket.py will catch EAGAIN and return None when you 233s # try and read into the file pointer created by http.client, which 233s # instead raises a BadStatusLine exception. Instead of catching 233s # the exception and assuming all BadStatusLine exceptions are read 233s # timeouts, check for a zero timeout before making the request. 233s if read_timeout == 0: 233s raise ReadTimeoutError( 233s self, url, f"Read timed out. (read timeout={read_timeout})" 233s ) 233s conn.timeout = read_timeout 233s 233s # Receive the response from the server 233s try: 233s response = conn.getresponse() 233s except (BaseSSLError, OSError) as e: 233s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 233s raise 233s 233s # Set properties that are used by the pooling layer. 
233s response.retries = retries 233s response._connection = response_conn # type: ignore[attr-defined] 233s response._pool = self # type: ignore[attr-defined] 233s 233s log.debug( 233s '%s://%s:%s "%s %s %s" %s %s', 233s self.scheme, 233s self.host, 233s self.port, 233s method, 233s url, 233s > response.version_string, 233s response.status, 233s response.length_remaining, 233s ) 233s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 233s 233s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 233s ----------------------------- Captured stderr call ----------------------------- 233s 127.0.0.1 - - [18/Jan/2025 02:31:40] "GET /bytes/1024 HTTP/1.1" 200 1024 233s _______________________________ test_auth[http] ________________________________ 233s 233s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-0/test_auth_http_0') 233s httpbin_both = 233s verify_pool_mgr = 233s 233s def test_auth(tmpdir, httpbin_both, verify_pool_mgr): 233s """Ensure that we can handle basic auth""" 233s auth = ("user", "passwd") 233s headers = urllib3.util.make_headers(basic_auth="{}:{}".format(*auth)) 233s url = httpbin_both.url + "/basic-auth/user/passwd" 233s with vcr.use_cassette(str(tmpdir.join("auth.yaml"))): 233s > one = verify_pool_mgr.request("GET", url, headers=headers) 233s 233s tests/integration/test_urllib3.py:67: 233s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 233s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 233s return self.request_encode_url( 233s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 233s return self.urlopen(method, url, **extra_kw) 233s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 233s response = conn.urlopen(method, u.request_uri, **kw) 233s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 233s response = self._make_request( 233s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 233s 233s self = 233s conn = 233s method = 'GET', url = '/basic-auth/user/passwd', body = None 233s headers = {'authorization': 'Basic dXNlcjpwYXNzd2Q='} 233s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 233s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 233s chunked = False, response_conn = None, preload_content = True 233s decode_content = True, enforce_content_length = True 233s 233s def _make_request( 233s self, 233s conn: BaseHTTPConnection, 233s method: str, 233s url: str, 233s body: _TYPE_BODY | None = None, 233s headers: typing.Mapping[str, str] | None = None, 233s retries: Retry | None = None, 233s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 233s chunked: bool = False, 233s response_conn: BaseHTTPConnection | None = None, 233s preload_content: bool = True, 233s decode_content: bool = True, 233s enforce_content_length: bool = True, 233s ) -> BaseHTTPResponse: 233s """ 233s Perform a request on a given urllib connection object taken from our 233s pool. 233s 233s :param conn: 233s a connection from one of our connection pools 233s 233s :param method: 233s HTTP request method (such as GET, POST, PUT, etc.) 233s 233s :param url: 233s The URL to perform the request on. 233s 233s :param body: 233s Data to send in the request body, either :class:`str`, :class:`bytes`, 233s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 233s 233s :param headers: 233s Dictionary of custom headers to send, such as User-Agent, 233s If-None-Match, etc. If None, pool headers are used. If provided, 233s these headers completely replace any pool-specific headers. 233s 233s :param retries: 233s Configure the number of retries to allow before raising a 233s :class:`~urllib3.exceptions.MaxRetryError` exception. 233s 233s Pass ``None`` to retry until you receive a response. 
Pass a 233s :class:`~urllib3.util.retry.Retry` object for fine-grained control 233s over different types of retries. 233s Pass an integer number to retry connection errors that many times, 233s but no other types of errors. Pass zero to never retry. 233s 233s If ``False``, then retries are disabled and any exception is raised 233s immediately. Also, instead of raising a MaxRetryError on redirects, 233s the redirect response will be returned. 233s 233s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 233s 233s :param timeout: 233s If specified, overrides the default timeout for this one 233s request. It may be a float (in seconds) or an instance of 233s :class:`urllib3.util.Timeout`. 233s 233s :param chunked: 233s If True, urllib3 will send the body using chunked transfer 233s encoding. Otherwise, urllib3 will send the body using the standard 233s content-length form. Defaults to False. 233s 233s :param response_conn: 233s Set this to ``None`` if you will handle releasing the connection or 233s set the connection to have the response release it. 233s 233s :param preload_content: 233s If True, the response's body will be preloaded during construction. 233s 233s :param decode_content: 233s If True, will attempt to decode the body based on the 233s 'content-encoding' header. 233s 233s :param enforce_content_length: 233s Enforce content length checking. Body returned by server must match 233s value of Content-Length header, if present. Otherwise, raise error. 233s """ 233s self.num_requests += 1 233s 233s timeout_obj = self._get_timeout(timeout) 233s timeout_obj.start_connect() 233s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 233s 233s try: 233s # Trigger any extra validation we need to do. 
233s try: 233s self._validate_conn(conn) 233s except (SocketTimeout, BaseSSLError) as e: 233s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 233s raise 233s 233s # _validate_conn() starts the connection to an HTTPS proxy 233s # so we need to wrap errors with 'ProxyError' here too. 233s except ( 233s OSError, 233s NewConnectionError, 233s TimeoutError, 233s BaseSSLError, 233s CertificateError, 233s SSLError, 233s ) as e: 233s new_e: Exception = e 233s if isinstance(e, (BaseSSLError, CertificateError)): 233s new_e = SSLError(e) 233s # If the connection didn't successfully connect to it's proxy 233s # then there 233s if isinstance( 233s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 233s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 233s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 233s raise new_e 233s 233s # conn.request() calls http.client.*.request, not the method in 233s # urllib3.request. It also calls makefile (recv) on the socket. 233s try: 233s conn.request( 233s method, 233s url, 233s body=body, 233s headers=headers, 233s chunked=chunked, 233s preload_content=preload_content, 233s decode_content=decode_content, 233s enforce_content_length=enforce_content_length, 233s ) 233s 233s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 233s # legitimately able to close the connection after sending a valid response. 233s # With this behaviour, the received response is still readable. 233s except BrokenPipeError: 233s pass 233s except OSError as e: 233s # MacOS/Linux 233s # EPROTOTYPE and ECONNRESET are needed on macOS 233s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 233s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
233s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 233s raise 233s 233s # Reset the timeout for the recv() on the socket 233s read_timeout = timeout_obj.read_timeout 233s 233s if not conn.is_closed: 233s # In Python 3 socket.py will catch EAGAIN and return None when you 233s # try and read into the file pointer created by http.client, which 233s # instead raises a BadStatusLine exception. Instead of catching 233s # the exception and assuming all BadStatusLine exceptions are read 233s # timeouts, check for a zero timeout before making the request. 233s if read_timeout == 0: 233s raise ReadTimeoutError( 233s self, url, f"Read timed out. (read timeout={read_timeout})" 233s ) 233s conn.timeout = read_timeout 233s 233s # Receive the response from the server 233s try: 233s response = conn.getresponse() 233s except (BaseSSLError, OSError) as e: 233s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 233s raise 233s 233s # Set properties that are used by the pooling layer. 
233s response.retries = retries 233s response._connection = response_conn # type: ignore[attr-defined] 233s response._pool = self # type: ignore[attr-defined] 233s 233s log.debug( 233s '%s://%s:%s "%s %s %s" %s %s', 233s self.scheme, 233s self.host, 233s self.port, 233s method, 233s url, 233s > response.version_string, 233s response.status, 233s response.length_remaining, 233s ) 233s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 233s 233s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 233s ----------------------------- Captured stderr call ----------------------------- 233s 127.0.0.1 - - [18/Jan/2025 02:31:40] "GET /basic-auth/user/passwd HTTP/1.1" 200 46 233s ____________________________ test_auth_failed[http] ____________________________ 233s 233s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-0/test_auth_failed_http_0') 233s httpbin_both = 233s verify_pool_mgr = 233s 233s def test_auth_failed(tmpdir, httpbin_both, verify_pool_mgr): 233s """Ensure that we can save failed auth statuses""" 233s auth = ("user", "wrongwrongwrong") 233s headers = urllib3.util.make_headers(basic_auth="{}:{}".format(*auth)) 233s url = httpbin_both.url + "/basic-auth/user/passwd" 233s with vcr.use_cassette(str(tmpdir.join("auth-failed.yaml"))) as cass: 233s # Ensure that this is empty to begin with 233s assert_cassette_empty(cass) 233s > one = verify_pool_mgr.request("GET", url, headers=headers) 233s 233s tests/integration/test_urllib3.py:83: 233s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 233s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 233s return self.request_encode_url( 233s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 233s return self.urlopen(method, url, **extra_kw) 233s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 233s response = conn.urlopen(method, u.request_uri, **kw) 233s 
/usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 233s response = self._make_request( 233s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 233s 233s self = 233s conn = 233s method = 'GET', url = '/basic-auth/user/passwd', body = None 233s headers = {'authorization': 'Basic dXNlcjp3cm9uZ3dyb25nd3Jvbmc='} 233s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 233s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 233s chunked = False, response_conn = None, preload_content = True 233s decode_content = True, enforce_content_length = True 233s 233s def _make_request( 233s self, 233s conn: BaseHTTPConnection, 233s method: str, 233s url: str, 233s body: _TYPE_BODY | None = None, 233s headers: typing.Mapping[str, str] | None = None, 233s retries: Retry | None = None, 233s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 233s chunked: bool = False, 233s response_conn: BaseHTTPConnection | None = None, 233s preload_content: bool = True, 233s decode_content: bool = True, 233s enforce_content_length: bool = True, 233s ) -> BaseHTTPResponse: 233s """ 233s Perform a request on a given urllib connection object taken from our 233s pool. 233s 233s :param conn: 233s a connection from one of our connection pools 233s 233s :param method: 233s HTTP request method (such as GET, POST, PUT, etc.) 233s 233s :param url: 233s The URL to perform the request on. 233s 233s :param body: 233s Data to send in the request body, either :class:`str`, :class:`bytes`, 233s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 233s 233s :param headers: 233s Dictionary of custom headers to send, such as User-Agent, 233s If-None-Match, etc. If None, pool headers are used. If provided, 233s these headers completely replace any pool-specific headers. 
233s 233s :param retries: 233s Configure the number of retries to allow before raising a 233s :class:`~urllib3.exceptions.MaxRetryError` exception. 233s 233s Pass ``None`` to retry until you receive a response. Pass a 233s :class:`~urllib3.util.retry.Retry` object for fine-grained control 233s over different types of retries. 233s Pass an integer number to retry connection errors that many times, 233s but no other types of errors. Pass zero to never retry. 233s 233s If ``False``, then retries are disabled and any exception is raised 233s immediately. Also, instead of raising a MaxRetryError on redirects, 233s the redirect response will be returned. 233s 233s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 233s 233s :param timeout: 233s If specified, overrides the default timeout for this one 233s request. It may be a float (in seconds) or an instance of 233s :class:`urllib3.util.Timeout`. 233s 233s :param chunked: 233s If True, urllib3 will send the body using chunked transfer 233s encoding. Otherwise, urllib3 will send the body using the standard 233s content-length form. Defaults to False. 233s 233s :param response_conn: 233s Set this to ``None`` if you will handle releasing the connection or 233s set the connection to have the response release it. 233s 233s :param preload_content: 233s If True, the response's body will be preloaded during construction. 233s 233s :param decode_content: 233s If True, will attempt to decode the body based on the 233s 'content-encoding' header. 233s 233s :param enforce_content_length: 233s Enforce content length checking. Body returned by server must match 233s value of Content-Length header, if present. Otherwise, raise error. 233s """ 233s self.num_requests += 1 233s 233s timeout_obj = self._get_timeout(timeout) 233s timeout_obj.start_connect() 233s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 233s 233s try: 233s # Trigger any extra validation we need to do. 
233s try: 233s self._validate_conn(conn) 233s except (SocketTimeout, BaseSSLError) as e: 233s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 233s raise 233s 233s # _validate_conn() starts the connection to an HTTPS proxy 233s # so we need to wrap errors with 'ProxyError' here too. 233s except ( 233s OSError, 233s NewConnectionError, 233s TimeoutError, 233s BaseSSLError, 233s CertificateError, 233s SSLError, 233s ) as e: 233s new_e: Exception = e 233s if isinstance(e, (BaseSSLError, CertificateError)): 233s new_e = SSLError(e) 233s # If the connection didn't successfully connect to it's proxy 233s # then there 233s if isinstance( 233s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 233s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 233s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 233s raise new_e 233s 233s # conn.request() calls http.client.*.request, not the method in 233s # urllib3.request. It also calls makefile (recv) on the socket. 233s try: 233s conn.request( 233s method, 233s url, 233s body=body, 233s headers=headers, 233s chunked=chunked, 233s preload_content=preload_content, 233s decode_content=decode_content, 233s enforce_content_length=enforce_content_length, 233s ) 233s 233s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 233s # legitimately able to close the connection after sending a valid response. 233s # With this behaviour, the received response is still readable. 233s except BrokenPipeError: 233s pass 233s except OSError as e: 233s # MacOS/Linux 233s # EPROTOTYPE and ECONNRESET are needed on macOS 233s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 233s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
233s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 233s raise 233s 233s # Reset the timeout for the recv() on the socket 233s read_timeout = timeout_obj.read_timeout 233s 233s if not conn.is_closed: 233s # In Python 3 socket.py will catch EAGAIN and return None when you 233s # try and read into the file pointer created by http.client, which 233s # instead raises a BadStatusLine exception. Instead of catching 233s # the exception and assuming all BadStatusLine exceptions are read 233s # timeouts, check for a zero timeout before making the request. 233s if read_timeout == 0: 233s raise ReadTimeoutError( 233s self, url, f"Read timed out. (read timeout={read_timeout})" 233s ) 233s conn.timeout = read_timeout 233s 233s # Receive the response from the server 233s try: 233s response = conn.getresponse() 233s except (BaseSSLError, OSError) as e: 233s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 233s raise 233s 233s # Set properties that are used by the pooling layer. 
233s response.retries = retries 233s response._connection = response_conn # type: ignore[attr-defined] 233s response._pool = self # type: ignore[attr-defined] 233s 233s log.debug( 233s '%s://%s:%s "%s %s %s" %s %s', 233s self.scheme, 233s self.host, 233s self.port, 233s method, 233s url, 233s > response.version_string, 233s response.status, 233s response.length_remaining, 233s ) 233s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 233s 233s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 233s ----------------------------- Captured stderr call ----------------------------- 233s 127.0.0.1 - - [18/Jan/2025 02:31:40] "GET /basic-auth/user/passwd HTTP/1.1" 401 0 233s _______________________________ test_post[http] ________________________________ 233s 233s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-0/test_post_http_0') 233s httpbin_both = 233s verify_pool_mgr = 233s 233s def test_post(tmpdir, httpbin_both, verify_pool_mgr): 233s """Ensure that we can post and cache the results""" 233s data = {"key1": "value1", "key2": "value2"} 233s url = httpbin_both.url + "/post" 233s with vcr.use_cassette(str(tmpdir.join("verify_pool_mgr.yaml"))): 233s > req1 = verify_pool_mgr.request("POST", url, data).data 233s 233s tests/integration/test_urllib3.py:94: 233s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 233s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:143: in request 233s return self.request_encode_body( 233s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:278: in request_encode_body 233s return self.urlopen(method, url, **extra_kw) 233s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 233s response = conn.urlopen(method, u.request_uri, **kw) 233s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 233s response = self._make_request( 233s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 233s 
233s self = 233s conn = 233s method = 'POST', url = '/post', body = {'key1': 'value1', 'key2': 'value2'} 233s headers = HTTPHeaderDict({}) 233s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 233s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 233s chunked = False, response_conn = None, preload_content = True 233s decode_content = True, enforce_content_length = True 233s 233s def _make_request( 233s self, 233s conn: BaseHTTPConnection, 233s method: str, 233s url: str, 233s body: _TYPE_BODY | None = None, 233s headers: typing.Mapping[str, str] | None = None, 233s retries: Retry | None = None, 233s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 233s chunked: bool = False, 233s response_conn: BaseHTTPConnection | None = None, 233s preload_content: bool = True, 233s decode_content: bool = True, 233s enforce_content_length: bool = True, 233s ) -> BaseHTTPResponse: 233s """ 233s Perform a request on a given urllib connection object taken from our 233s pool. 233s 233s :param conn: 233s a connection from one of our connection pools 233s 233s :param method: 233s HTTP request method (such as GET, POST, PUT, etc.) 233s 233s :param url: 233s The URL to perform the request on. 233s 233s :param body: 233s Data to send in the request body, either :class:`str`, :class:`bytes`, 233s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 233s 233s :param headers: 233s Dictionary of custom headers to send, such as User-Agent, 233s If-None-Match, etc. If None, pool headers are used. If provided, 233s these headers completely replace any pool-specific headers. 233s 233s :param retries: 233s Configure the number of retries to allow before raising a 233s :class:`~urllib3.exceptions.MaxRetryError` exception. 233s 233s Pass ``None`` to retry until you receive a response. Pass a 233s :class:`~urllib3.util.retry.Retry` object for fine-grained control 233s over different types of retries. 
233s Pass an integer number to retry connection errors that many times, 233s but no other types of errors. Pass zero to never retry. 233s 233s If ``False``, then retries are disabled and any exception is raised 233s immediately. Also, instead of raising a MaxRetryError on redirects, 233s the redirect response will be returned. 233s 233s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 233s 233s :param timeout: 233s If specified, overrides the default timeout for this one 233s request. It may be a float (in seconds) or an instance of 233s :class:`urllib3.util.Timeout`. 233s 233s :param chunked: 233s If True, urllib3 will send the body using chunked transfer 233s encoding. Otherwise, urllib3 will send the body using the standard 233s content-length form. Defaults to False. 233s 233s :param response_conn: 233s Set this to ``None`` if you will handle releasing the connection or 233s set the connection to have the response release it. 233s 233s :param preload_content: 233s If True, the response's body will be preloaded during construction. 233s 233s :param decode_content: 233s If True, will attempt to decode the body based on the 233s 'content-encoding' header. 233s 233s :param enforce_content_length: 233s Enforce content length checking. Body returned by server must match 233s value of Content-Length header, if present. Otherwise, raise error. 233s """ 233s self.num_requests += 1 233s 233s timeout_obj = self._get_timeout(timeout) 233s timeout_obj.start_connect() 233s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 233s 233s try: 233s # Trigger any extra validation we need to do. 233s try: 233s self._validate_conn(conn) 233s except (SocketTimeout, BaseSSLError) as e: 233s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 233s raise 233s 233s # _validate_conn() starts the connection to an HTTPS proxy 233s # so we need to wrap errors with 'ProxyError' here too. 
233s except ( 233s OSError, 233s NewConnectionError, 233s TimeoutError, 233s BaseSSLError, 233s CertificateError, 233s SSLError, 233s ) as e: 233s new_e: Exception = e 233s if isinstance(e, (BaseSSLError, CertificateError)): 233s new_e = SSLError(e) 233s # If the connection didn't successfully connect to it's proxy 233s # then there 233s if isinstance( 233s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 233s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 233s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 233s raise new_e 233s 233s # conn.request() calls http.client.*.request, not the method in 233s # urllib3.request. It also calls makefile (recv) on the socket. 233s try: 233s conn.request( 233s method, 233s url, 233s body=body, 233s headers=headers, 233s chunked=chunked, 233s preload_content=preload_content, 233s decode_content=decode_content, 233s enforce_content_length=enforce_content_length, 233s ) 233s 233s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 233s # legitimately able to close the connection after sending a valid response. 233s # With this behaviour, the received response is still readable. 233s except BrokenPipeError: 233s pass 233s except OSError as e: 233s # MacOS/Linux 233s # EPROTOTYPE and ECONNRESET are needed on macOS 233s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 233s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 233s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 233s raise 233s 233s # Reset the timeout for the recv() on the socket 233s read_timeout = timeout_obj.read_timeout 233s 233s if not conn.is_closed: 233s # In Python 3 socket.py will catch EAGAIN and return None when you 233s # try and read into the file pointer created by http.client, which 233s # instead raises a BadStatusLine exception. 
Instead of catching 233s # the exception and assuming all BadStatusLine exceptions are read 233s # timeouts, check for a zero timeout before making the request. 233s if read_timeout == 0: 233s raise ReadTimeoutError( 233s self, url, f"Read timed out. (read timeout={read_timeout})" 233s ) 233s conn.timeout = read_timeout 233s 233s # Receive the response from the server 233s try: 233s response = conn.getresponse() 233s except (BaseSSLError, OSError) as e: 233s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 233s raise 233s 233s # Set properties that are used by the pooling layer. 233s response.retries = retries 233s response._connection = response_conn # type: ignore[attr-defined] 233s response._pool = self # type: ignore[attr-defined] 233s 233s log.debug( 233s '%s://%s:%s "%s %s %s" %s %s', 233s self.scheme, 233s self.host, 233s self.port, 233s method, 233s url, 233s > response.version_string, 233s response.status, 233s response.length_remaining, 233s ) 233s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 233s 233s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 233s ----------------------------- Captured stderr call ----------------------------- 233s 127.0.0.1 - - [18/Jan/2025 02:31:40] "POST /post HTTP/1.1" 501 159 233s _______________________________ test_gzip[http] ________________________________ 233s 233s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-0/test_gzip_http_0') 233s httpbin_both = 233s verify_pool_mgr = 233s 233s def test_gzip(tmpdir, httpbin_both, verify_pool_mgr): 233s """ 233s Ensure that requests (actually urllib3) is able to automatically decompress 233s the response body 233s """ 233s url = httpbin_both.url + "/gzip" 233s response = verify_pool_mgr.request("GET", url) 233s 233s with vcr.use_cassette(str(tmpdir.join("gzip.yaml"))): 233s > response = verify_pool_mgr.request("GET", url) 233s 233s tests/integration/test_urllib3.py:140: 233s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 233s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 233s return self.request_encode_url( 233s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 233s return self.urlopen(method, url, **extra_kw) 233s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 233s response = conn.urlopen(method, u.request_uri, **kw) 233s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 233s response = self._make_request( 233s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 233s 233s self = 233s conn = 233s method = 'GET', url = '/gzip', body = None, headers = {} 233s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 233s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 233s chunked = False, response_conn = None, preload_content = True 233s decode_content = True, enforce_content_length = True 233s 233s def _make_request( 233s self, 233s conn: BaseHTTPConnection, 233s method: str, 233s url: str, 233s body: _TYPE_BODY | None = None, 233s headers: typing.Mapping[str, str] | None = None, 233s retries: Retry | None = None, 233s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 233s chunked: bool = False, 233s response_conn: BaseHTTPConnection | None = None, 233s preload_content: bool = True, 233s decode_content: bool = True, 233s enforce_content_length: bool = True, 233s ) -> BaseHTTPResponse: 233s """ 233s Perform a request on a given urllib connection object taken from our 233s pool. 233s 233s :param conn: 233s a connection from one of our connection pools 233s 233s :param method: 233s HTTP request method (such as GET, POST, PUT, etc.) 233s 233s :param url: 233s The URL to perform the request on. 
233s 233s :param body: 233s Data to send in the request body, either :class:`str`, :class:`bytes`, 233s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 233s 233s :param headers: 233s Dictionary of custom headers to send, such as User-Agent, 233s If-None-Match, etc. If None, pool headers are used. If provided, 233s these headers completely replace any pool-specific headers. 233s 233s :param retries: 233s Configure the number of retries to allow before raising a 233s :class:`~urllib3.exceptions.MaxRetryError` exception. 233s 233s Pass ``None`` to retry until you receive a response. Pass a 233s :class:`~urllib3.util.retry.Retry` object for fine-grained control 233s over different types of retries. 233s Pass an integer number to retry connection errors that many times, 233s but no other types of errors. Pass zero to never retry. 233s 233s If ``False``, then retries are disabled and any exception is raised 233s immediately. Also, instead of raising a MaxRetryError on redirects, 233s the redirect response will be returned. 233s 233s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 233s 233s :param timeout: 233s If specified, overrides the default timeout for this one 233s request. It may be a float (in seconds) or an instance of 233s :class:`urllib3.util.Timeout`. 233s 233s :param chunked: 233s If True, urllib3 will send the body using chunked transfer 233s encoding. Otherwise, urllib3 will send the body using the standard 233s content-length form. Defaults to False. 233s 233s :param response_conn: 233s Set this to ``None`` if you will handle releasing the connection or 233s set the connection to have the response release it. 233s 233s :param preload_content: 233s If True, the response's body will be preloaded during construction. 233s 233s :param decode_content: 233s If True, will attempt to decode the body based on the 233s 'content-encoding' header. 233s 233s :param enforce_content_length: 233s Enforce content length checking. 
Body returned by server must match 233s value of Content-Length header, if present. Otherwise, raise error. 233s """ 233s self.num_requests += 1 233s 233s timeout_obj = self._get_timeout(timeout) 233s timeout_obj.start_connect() 233s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 233s 233s try: 233s # Trigger any extra validation we need to do. 233s try: 233s self._validate_conn(conn) 233s except (SocketTimeout, BaseSSLError) as e: 233s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 233s raise 233s 233s # _validate_conn() starts the connection to an HTTPS proxy 233s # so we need to wrap errors with 'ProxyError' here too. 233s except ( 233s OSError, 233s NewConnectionError, 233s TimeoutError, 233s BaseSSLError, 233s CertificateError, 233s SSLError, 233s ) as e: 233s new_e: Exception = e 233s if isinstance(e, (BaseSSLError, CertificateError)): 233s new_e = SSLError(e) 233s # If the connection didn't successfully connect to it's proxy 233s # then there 233s if isinstance( 233s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 233s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 233s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 233s raise new_e 233s 233s # conn.request() calls http.client.*.request, not the method in 233s # urllib3.request. It also calls makefile (recv) on the socket. 233s try: 233s conn.request( 233s method, 233s url, 233s body=body, 233s headers=headers, 233s chunked=chunked, 233s preload_content=preload_content, 233s decode_content=decode_content, 233s enforce_content_length=enforce_content_length, 233s ) 233s 233s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 233s # legitimately able to close the connection after sending a valid response. 233s # With this behaviour, the received response is still readable. 
233s except BrokenPipeError: 233s pass 233s except OSError as e: 233s # MacOS/Linux 233s # EPROTOTYPE and ECONNRESET are needed on macOS 233s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 233s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 233s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 233s raise 233s 233s # Reset the timeout for the recv() on the socket 233s read_timeout = timeout_obj.read_timeout 233s 233s if not conn.is_closed: 233s # In Python 3 socket.py will catch EAGAIN and return None when you 233s # try and read into the file pointer created by http.client, which 233s # instead raises a BadStatusLine exception. Instead of catching 233s # the exception and assuming all BadStatusLine exceptions are read 233s # timeouts, check for a zero timeout before making the request. 233s if read_timeout == 0: 233s raise ReadTimeoutError( 233s self, url, f"Read timed out. (read timeout={read_timeout})" 233s ) 233s conn.timeout = read_timeout 233s 233s # Receive the response from the server 233s try: 233s response = conn.getresponse() 233s except (BaseSSLError, OSError) as e: 233s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 233s raise 233s 233s # Set properties that are used by the pooling layer. 
233s             response.retries = retries
233s             response._connection = response_conn  # type: ignore[attr-defined]
233s             response._pool = self  # type: ignore[attr-defined]
233s 
233s             log.debug(
233s                 '%s://%s:%s "%s %s %s" %s %s',
233s                 self.scheme,
233s                 self.host,
233s                 self.port,
233s                 method,
233s                 url,
233s >               response.version_string,
233s                 response.status,
233s                 response.length_remaining,
233s             )
233s E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
233s 
233s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError
233s ----------------------------- Captured stderr call -----------------------------
233s 127.0.0.1 - - [18/Jan/2025 02:31:40] "GET /gzip HTTP/1.1" 200 165
233s 127.0.0.1 - - [18/Jan/2025 02:31:40] "GET /gzip HTTP/1.1" 200 165
233s ___________________________ test_status_code[https] ____________________________
233s 
233s httpbin_both = 
233s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-0/test_status_code_https_0')
233s verify_pool_mgr = 
233s 
233s     def test_status_code(httpbin_both, tmpdir, verify_pool_mgr):
233s         """Ensure that we can read the status code"""
233s         url = httpbin_both.url
233s         with vcr.use_cassette(str(tmpdir.join("atts.yaml"))):
233s >           status_code = verify_pool_mgr.request("GET", url).status
233s 
233s tests/integration/test_urllib3.py:34: 
233s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
233s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request
233s     return self.request_encode_url(
233s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url
233s     return self.urlopen(method, url, **extra_kw)
233s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen
233s     response = conn.urlopen(method, u.request_uri, **kw)
233s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen
233s     response = self._make_request(
233s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
233s 
233s 
self = 233s conn = 233s method = 'GET', url = '/', body = None, headers = {} 233s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 233s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 233s chunked = False, response_conn = None, preload_content = True 233s decode_content = True, enforce_content_length = True 233s 233s def _make_request( 233s self, 233s conn: BaseHTTPConnection, 233s method: str, 233s url: str, 233s body: _TYPE_BODY | None = None, 233s headers: typing.Mapping[str, str] | None = None, 233s retries: Retry | None = None, 233s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 233s chunked: bool = False, 233s response_conn: BaseHTTPConnection | None = None, 233s preload_content: bool = True, 233s decode_content: bool = True, 233s enforce_content_length: bool = True, 233s ) -> BaseHTTPResponse: 233s """ 233s Perform a request on a given urllib connection object taken from our 233s pool. 233s 233s :param conn: 233s a connection from one of our connection pools 233s 233s :param method: 233s HTTP request method (such as GET, POST, PUT, etc.) 233s 233s :param url: 233s The URL to perform the request on. 233s 233s :param body: 233s Data to send in the request body, either :class:`str`, :class:`bytes`, 233s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 233s 233s :param headers: 233s Dictionary of custom headers to send, such as User-Agent, 233s If-None-Match, etc. If None, pool headers are used. If provided, 233s these headers completely replace any pool-specific headers. 233s 233s :param retries: 233s Configure the number of retries to allow before raising a 233s :class:`~urllib3.exceptions.MaxRetryError` exception. 233s 233s Pass ``None`` to retry until you receive a response. Pass a 233s :class:`~urllib3.util.retry.Retry` object for fine-grained control 233s over different types of retries. 
233s Pass an integer number to retry connection errors that many times, 233s but no other types of errors. Pass zero to never retry. 233s 233s If ``False``, then retries are disabled and any exception is raised 233s immediately. Also, instead of raising a MaxRetryError on redirects, 233s the redirect response will be returned. 233s 233s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 233s 233s :param timeout: 233s If specified, overrides the default timeout for this one 233s request. It may be a float (in seconds) or an instance of 233s :class:`urllib3.util.Timeout`. 233s 233s :param chunked: 233s If True, urllib3 will send the body using chunked transfer 233s encoding. Otherwise, urllib3 will send the body using the standard 233s content-length form. Defaults to False. 233s 233s :param response_conn: 233s Set this to ``None`` if you will handle releasing the connection or 233s set the connection to have the response release it. 233s 233s :param preload_content: 233s If True, the response's body will be preloaded during construction. 233s 233s :param decode_content: 233s If True, will attempt to decode the body based on the 233s 'content-encoding' header. 233s 233s :param enforce_content_length: 233s Enforce content length checking. Body returned by server must match 233s value of Content-Length header, if present. Otherwise, raise error. 233s """ 233s self.num_requests += 1 233s 233s timeout_obj = self._get_timeout(timeout) 233s timeout_obj.start_connect() 233s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 233s 233s try: 233s # Trigger any extra validation we need to do. 233s try: 233s self._validate_conn(conn) 233s except (SocketTimeout, BaseSSLError) as e: 233s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 233s raise 233s 233s # _validate_conn() starts the connection to an HTTPS proxy 233s # so we need to wrap errors with 'ProxyError' here too. 
233s except ( 233s OSError, 233s NewConnectionError, 233s TimeoutError, 233s BaseSSLError, 233s CertificateError, 233s SSLError, 233s ) as e: 233s new_e: Exception = e 233s if isinstance(e, (BaseSSLError, CertificateError)): 233s new_e = SSLError(e) 233s # If the connection didn't successfully connect to it's proxy 233s # then there 233s if isinstance( 233s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 233s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 233s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 233s raise new_e 233s 233s # conn.request() calls http.client.*.request, not the method in 233s # urllib3.request. It also calls makefile (recv) on the socket. 233s try: 233s conn.request( 233s method, 233s url, 233s body=body, 233s headers=headers, 233s chunked=chunked, 233s preload_content=preload_content, 233s decode_content=decode_content, 233s enforce_content_length=enforce_content_length, 233s ) 233s 233s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 233s # legitimately able to close the connection after sending a valid response. 233s # With this behaviour, the received response is still readable. 233s except BrokenPipeError: 233s pass 233s except OSError as e: 233s # MacOS/Linux 233s # EPROTOTYPE and ECONNRESET are needed on macOS 233s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 233s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 233s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 233s raise 233s 233s # Reset the timeout for the recv() on the socket 233s read_timeout = timeout_obj.read_timeout 233s 233s if not conn.is_closed: 233s # In Python 3 socket.py will catch EAGAIN and return None when you 233s # try and read into the file pointer created by http.client, which 233s # instead raises a BadStatusLine exception. 
Instead of catching 233s # the exception and assuming all BadStatusLine exceptions are read 233s # timeouts, check for a zero timeout before making the request. 233s if read_timeout == 0: 233s raise ReadTimeoutError( 233s self, url, f"Read timed out. (read timeout={read_timeout})" 233s ) 233s conn.timeout = read_timeout 233s 233s # Receive the response from the server 233s try: 233s response = conn.getresponse() 233s except (BaseSSLError, OSError) as e: 233s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 233s raise 233s 233s # Set properties that are used by the pooling layer. 233s response.retries = retries 233s response._connection = response_conn # type: ignore[attr-defined] 233s response._pool = self # type: ignore[attr-defined] 233s 233s log.debug( 233s '%s://%s:%s "%s %s %s" %s %s', 233s self.scheme, 233s self.host, 233s self.port, 233s method, 233s url, 233s > response.version_string, 233s response.status, 233s response.length_remaining, 233s ) 233s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 233s 233s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 233s ----------------------------- Captured stderr call ----------------------------- 233s 127.0.0.1 - - [18/Jan/2025 02:31:40] "GET / HTTP/1.1" 200 9358 233s _____________________________ test_headers[https] ______________________________ 233s 233s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-0/test_headers_https_0') 233s httpbin_both = 233s verify_pool_mgr = 233s 233s def test_headers(tmpdir, httpbin_both, verify_pool_mgr): 233s """Ensure that we can read the headers back""" 233s url = httpbin_both.url 233s with vcr.use_cassette(str(tmpdir.join("headers.yaml"))): 233s > headers = verify_pool_mgr.request("GET", url).headers 233s 233s tests/integration/test_urllib3.py:44: 233s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 233s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 
233s return self.request_encode_url( 233s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 233s return self.urlopen(method, url, **extra_kw) 233s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 233s response = conn.urlopen(method, u.request_uri, **kw) 233s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 233s response = self._make_request( 233s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 233s 233s self = 233s conn = 233s method = 'GET', url = '/', body = None, headers = {} 233s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 233s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 233s chunked = False, response_conn = None, preload_content = True 233s decode_content = True, enforce_content_length = True 233s 233s def _make_request( 233s self, 233s conn: BaseHTTPConnection, 233s method: str, 233s url: str, 233s body: _TYPE_BODY | None = None, 233s headers: typing.Mapping[str, str] | None = None, 233s retries: Retry | None = None, 233s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 233s chunked: bool = False, 233s response_conn: BaseHTTPConnection | None = None, 233s preload_content: bool = True, 233s decode_content: bool = True, 233s enforce_content_length: bool = True, 233s ) -> BaseHTTPResponse: 233s """ 233s Perform a request on a given urllib connection object taken from our 233s pool. 233s 233s :param conn: 233s a connection from one of our connection pools 233s 233s :param method: 233s HTTP request method (such as GET, POST, PUT, etc.) 233s 233s :param url: 233s The URL to perform the request on. 233s 233s :param body: 233s Data to send in the request body, either :class:`str`, :class:`bytes`, 233s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 
233s 233s :param headers: 233s Dictionary of custom headers to send, such as User-Agent, 233s If-None-Match, etc. If None, pool headers are used. If provided, 233s these headers completely replace any pool-specific headers. 233s 233s :param retries: 233s Configure the number of retries to allow before raising a 233s :class:`~urllib3.exceptions.MaxRetryError` exception. 233s 233s Pass ``None`` to retry until you receive a response. Pass a 233s :class:`~urllib3.util.retry.Retry` object for fine-grained control 233s over different types of retries. 233s Pass an integer number to retry connection errors that many times, 233s but no other types of errors. Pass zero to never retry. 233s 233s If ``False``, then retries are disabled and any exception is raised 233s immediately. Also, instead of raising a MaxRetryError on redirects, 233s the redirect response will be returned. 233s 233s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 233s 233s :param timeout: 233s If specified, overrides the default timeout for this one 233s request. It may be a float (in seconds) or an instance of 233s :class:`urllib3.util.Timeout`. 233s 233s :param chunked: 233s If True, urllib3 will send the body using chunked transfer 233s encoding. Otherwise, urllib3 will send the body using the standard 233s content-length form. Defaults to False. 233s 233s :param response_conn: 233s Set this to ``None`` if you will handle releasing the connection or 233s set the connection to have the response release it. 233s 233s :param preload_content: 233s If True, the response's body will be preloaded during construction. 233s 233s :param decode_content: 233s If True, will attempt to decode the body based on the 233s 'content-encoding' header. 233s 233s :param enforce_content_length: 233s Enforce content length checking. Body returned by server must match 233s value of Content-Length header, if present. Otherwise, raise error. 
233s """ 233s self.num_requests += 1 233s 233s timeout_obj = self._get_timeout(timeout) 233s timeout_obj.start_connect() 233s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 233s 233s try: 233s # Trigger any extra validation we need to do. 233s try: 233s self._validate_conn(conn) 233s except (SocketTimeout, BaseSSLError) as e: 233s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 233s raise 233s 233s # _validate_conn() starts the connection to an HTTPS proxy 233s # so we need to wrap errors with 'ProxyError' here too. 233s except ( 233s OSError, 233s NewConnectionError, 233s TimeoutError, 233s BaseSSLError, 233s CertificateError, 233s SSLError, 233s ) as e: 233s new_e: Exception = e 233s if isinstance(e, (BaseSSLError, CertificateError)): 233s new_e = SSLError(e) 233s # If the connection didn't successfully connect to it's proxy 233s # then there 233s if isinstance( 233s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 233s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 233s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 233s raise new_e 233s 233s # conn.request() calls http.client.*.request, not the method in 233s # urllib3.request. It also calls makefile (recv) on the socket. 233s try: 233s conn.request( 233s method, 233s url, 233s body=body, 233s headers=headers, 233s chunked=chunked, 233s preload_content=preload_content, 233s decode_content=decode_content, 233s enforce_content_length=enforce_content_length, 233s ) 233s 233s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 233s # legitimately able to close the connection after sending a valid response. 233s # With this behaviour, the received response is still readable. 
233s except BrokenPipeError: 233s pass 233s except OSError as e: 233s # MacOS/Linux 233s # EPROTOTYPE and ECONNRESET are needed on macOS 233s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 233s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 233s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 233s raise 233s 233s # Reset the timeout for the recv() on the socket 233s read_timeout = timeout_obj.read_timeout 233s 233s if not conn.is_closed: 233s # In Python 3 socket.py will catch EAGAIN and return None when you 233s # try and read into the file pointer created by http.client, which 233s # instead raises a BadStatusLine exception. Instead of catching 233s # the exception and assuming all BadStatusLine exceptions are read 233s # timeouts, check for a zero timeout before making the request. 233s if read_timeout == 0: 233s raise ReadTimeoutError( 233s self, url, f"Read timed out. (read timeout={read_timeout})" 233s ) 233s conn.timeout = read_timeout 233s 233s # Receive the response from the server 233s try: 233s response = conn.getresponse() 233s except (BaseSSLError, OSError) as e: 233s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 233s raise 233s 233s # Set properties that are used by the pooling layer. 
233s             response.retries = retries
233s             response._connection = response_conn  # type: ignore[attr-defined]
233s             response._pool = self  # type: ignore[attr-defined]
233s 
233s             log.debug(
233s                 '%s://%s:%s "%s %s %s" %s %s',
233s                 self.scheme,
233s                 self.host,
233s                 self.port,
233s                 method,
233s                 url,
233s >               response.version_string,
233s                 response.status,
233s                 response.length_remaining,
233s             )
233s E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
233s 
233s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError
233s ----------------------------- Captured stderr call -----------------------------
233s 127.0.0.1 - - [18/Jan/2025 02:31:40] "GET / HTTP/1.1" 200 9358
233s _______________________________ test_body[https] _______________________________
233s 
233s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-0/test_body_https_0')
233s httpbin_both = 
233s verify_pool_mgr = 
233s 
233s     def test_body(tmpdir, httpbin_both, verify_pool_mgr):
233s         """Ensure the responses are all identical enough"""
233s         url = httpbin_both.url + "/bytes/1024"
233s         with vcr.use_cassette(str(tmpdir.join("body.yaml"))):
233s >           content = verify_pool_mgr.request("GET", url).data
233s 
233s tests/integration/test_urllib3.py:55: 
233s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
233s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request
233s     return self.request_encode_url(
233s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url
233s     return self.urlopen(method, url, **extra_kw)
233s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen
233s     response = conn.urlopen(method, u.request_uri, **kw)
233s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen
233s     response = self._make_request(
233s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
233s 
233s self = 
233s conn = 
233s method = 'GET', url = '/bytes/1024', body = None, 
headers = {} 233s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 233s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 233s chunked = False, response_conn = None, preload_content = True 233s decode_content = True, enforce_content_length = True 233s 233s def _make_request( 233s self, 233s conn: BaseHTTPConnection, 233s method: str, 233s url: str, 233s body: _TYPE_BODY | None = None, 233s headers: typing.Mapping[str, str] | None = None, 233s retries: Retry | None = None, 233s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 233s chunked: bool = False, 233s response_conn: BaseHTTPConnection | None = None, 233s preload_content: bool = True, 233s decode_content: bool = True, 233s enforce_content_length: bool = True, 233s ) -> BaseHTTPResponse: 233s """ 233s Perform a request on a given urllib connection object taken from our 233s pool. 233s 233s :param conn: 233s a connection from one of our connection pools 233s 233s :param method: 233s HTTP request method (such as GET, POST, PUT, etc.) 233s 233s :param url: 233s The URL to perform the request on. 233s 233s :param body: 233s Data to send in the request body, either :class:`str`, :class:`bytes`, 233s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 233s 233s :param headers: 233s Dictionary of custom headers to send, such as User-Agent, 233s If-None-Match, etc. If None, pool headers are used. If provided, 233s these headers completely replace any pool-specific headers. 233s 233s :param retries: 233s Configure the number of retries to allow before raising a 233s :class:`~urllib3.exceptions.MaxRetryError` exception. 233s 233s Pass ``None`` to retry until you receive a response. Pass a 233s :class:`~urllib3.util.retry.Retry` object for fine-grained control 233s over different types of retries. 233s Pass an integer number to retry connection errors that many times, 233s but no other types of errors. Pass zero to never retry. 
233s 233s If ``False``, then retries are disabled and any exception is raised 233s immediately. Also, instead of raising a MaxRetryError on redirects, 233s the redirect response will be returned. 233s 233s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 233s 233s :param timeout: 233s If specified, overrides the default timeout for this one 233s request. It may be a float (in seconds) or an instance of 233s :class:`urllib3.util.Timeout`. 233s 233s :param chunked: 233s If True, urllib3 will send the body using chunked transfer 233s encoding. Otherwise, urllib3 will send the body using the standard 233s content-length form. Defaults to False. 233s 233s :param response_conn: 233s Set this to ``None`` if you will handle releasing the connection or 233s set the connection to have the response release it. 233s 233s :param preload_content: 233s If True, the response's body will be preloaded during construction. 233s 233s :param decode_content: 233s If True, will attempt to decode the body based on the 233s 'content-encoding' header. 233s 233s :param enforce_content_length: 233s Enforce content length checking. Body returned by server must match 233s value of Content-Length header, if present. Otherwise, raise error. 233s """ 233s self.num_requests += 1 233s 233s timeout_obj = self._get_timeout(timeout) 233s timeout_obj.start_connect() 233s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 233s 233s try: 233s # Trigger any extra validation we need to do. 233s try: 233s self._validate_conn(conn) 233s except (SocketTimeout, BaseSSLError) as e: 233s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 233s raise 233s 233s # _validate_conn() starts the connection to an HTTPS proxy 233s # so we need to wrap errors with 'ProxyError' here too. 
233s except ( 233s OSError, 233s NewConnectionError, 233s TimeoutError, 233s BaseSSLError, 233s CertificateError, 233s SSLError, 233s ) as e: 233s new_e: Exception = e 233s if isinstance(e, (BaseSSLError, CertificateError)): 233s new_e = SSLError(e) 233s # If the connection didn't successfully connect to it's proxy 233s # then there 233s if isinstance( 233s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 233s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 233s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 233s raise new_e 233s 233s # conn.request() calls http.client.*.request, not the method in 233s # urllib3.request. It also calls makefile (recv) on the socket. 233s try: 233s conn.request( 233s method, 233s url, 233s body=body, 233s headers=headers, 233s chunked=chunked, 233s preload_content=preload_content, 233s decode_content=decode_content, 233s enforce_content_length=enforce_content_length, 233s ) 233s 233s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 233s # legitimately able to close the connection after sending a valid response. 233s # With this behaviour, the received response is still readable. 233s except BrokenPipeError: 233s pass 233s except OSError as e: 233s # MacOS/Linux 233s # EPROTOTYPE and ECONNRESET are needed on macOS 233s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 233s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 233s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 233s raise 233s 233s # Reset the timeout for the recv() on the socket 233s read_timeout = timeout_obj.read_timeout 233s 233s if not conn.is_closed: 233s # In Python 3 socket.py will catch EAGAIN and return None when you 233s # try and read into the file pointer created by http.client, which 233s # instead raises a BadStatusLine exception. 
Instead of catching 233s # the exception and assuming all BadStatusLine exceptions are read 233s # timeouts, check for a zero timeout before making the request. 233s if read_timeout == 0: 233s raise ReadTimeoutError( 233s self, url, f"Read timed out. (read timeout={read_timeout})" 233s ) 233s conn.timeout = read_timeout 233s 233s # Receive the response from the server 233s try: 233s response = conn.getresponse() 233s except (BaseSSLError, OSError) as e: 233s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 233s raise 233s 233s # Set properties that are used by the pooling layer. 233s response.retries = retries 233s response._connection = response_conn # type: ignore[attr-defined] 233s response._pool = self # type: ignore[attr-defined] 233s 233s log.debug( 233s '%s://%s:%s "%s %s %s" %s %s', 233s self.scheme, 233s self.host, 233s self.port, 233s method, 233s url, 233s > response.version_string, 233s response.status, 233s response.length_remaining, 233s ) 233s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 233s 233s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 233s ----------------------------- Captured stderr call ----------------------------- 233s 127.0.0.1 - - [18/Jan/2025 02:31:40] "GET /bytes/1024 HTTP/1.1" 200 1024 233s _______________________________ test_auth[https] _______________________________ 233s 233s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-0/test_auth_https_0') 233s httpbin_both = 233s verify_pool_mgr = 233s 233s def test_auth(tmpdir, httpbin_both, verify_pool_mgr): 233s """Ensure that we can handle basic auth""" 233s auth = ("user", "passwd") 233s headers = urllib3.util.make_headers(basic_auth="{}:{}".format(*auth)) 233s url = httpbin_both.url + "/basic-auth/user/passwd" 233s with vcr.use_cassette(str(tmpdir.join("auth.yaml"))): 233s > one = verify_pool_mgr.request("GET", url, headers=headers) 233s 233s tests/integration/test_urllib3.py:67: 233s _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
233s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request
233s return self.request_encode_url(
233s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url
233s return self.urlopen(method, url, **extra_kw)
233s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen
233s response = conn.urlopen(method, u.request_uri, **kw)
233s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen
233s response = self._make_request(
233s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
233s
233s self =
233s conn =
233s method = 'GET', url = '/basic-auth/user/passwd', body = None
233s headers = {'authorization': 'Basic dXNlcjpwYXNzd2Q='}
233s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None)
233s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None)
233s chunked = False, response_conn = None, preload_content = True
233s decode_content = True, enforce_content_length = True
233s response.retries = retries
233s response._connection = response_conn # type: ignore[attr-defined]
233s response._pool = self # type: ignore[attr-defined]
233s
233s log.debug(
233s '%s://%s:%s "%s %s %s" %s %s',
233s self.scheme,
233s self.host,
233s self.port,
233s method,
233s url,
233s > response.version_string,
233s response.status,
233s response.length_remaining,
233s )
233s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
233s
233s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError
233s ----------------------------- Captured stderr call -----------------------------
233s 127.0.0.1 - - [18/Jan/2025 02:31:40] "GET /basic-auth/user/passwd HTTP/1.1" 200 46
233s ___________________________ test_auth_failed[https] ____________________________
233s
233s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-0/test_auth_failed_https_0')
233s httpbin_both =
233s verify_pool_mgr =
233s
233s def test_auth_failed(tmpdir, httpbin_both, verify_pool_mgr):
233s """Ensure that we can save failed auth statuses"""
233s auth = ("user", "wrongwrongwrong")
233s headers = urllib3.util.make_headers(basic_auth="{}:{}".format(*auth))
233s url = httpbin_both.url + "/basic-auth/user/passwd"
233s with vcr.use_cassette(str(tmpdir.join("auth-failed.yaml"))) as cass:
233s # Ensure that this is empty to begin with
233s assert_cassette_empty(cass)
233s > one = verify_pool_mgr.request("GET", url, headers=headers)
233s
233s tests/integration/test_urllib3.py:83:
233s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
233s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request
233s return self.request_encode_url(
233s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url
233s return self.urlopen(method, url, **extra_kw)
233s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen
233s response = conn.urlopen(method, u.request_uri, **kw)
233s
/usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen
233s response = self._make_request(
233s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
233s
233s self =
233s conn =
233s method = 'GET', url = '/basic-auth/user/passwd', body = None
233s headers = {'authorization': 'Basic dXNlcjp3cm9uZ3dyb25nd3Jvbmc='}
233s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None)
233s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None)
233s chunked = False, response_conn = None, preload_content = True
233s decode_content = True, enforce_content_length = True
233s response.retries = retries
233s response._connection = response_conn # type: ignore[attr-defined]
233s response._pool = self # type: ignore[attr-defined]
233s
233s log.debug(
233s '%s://%s:%s "%s %s %s" %s %s',
233s self.scheme,
233s self.host,
233s self.port,
233s method,
233s url,
233s > response.version_string,
233s response.status,
233s response.length_remaining,
233s )
233s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
233s
233s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError
233s ----------------------------- Captured stderr call -----------------------------
233s 127.0.0.1 - - [18/Jan/2025 02:31:40] "GET /basic-auth/user/passwd HTTP/1.1" 401 0
233s _______________________________ test_post[https] _______________________________
233s
233s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-0/test_post_https_0')
233s httpbin_both =
233s verify_pool_mgr =
233s
233s def test_post(tmpdir, httpbin_both, verify_pool_mgr):
233s """Ensure that we can post and cache the results"""
233s data = {"key1": "value1", "key2": "value2"}
233s url = httpbin_both.url + "/post"
233s with vcr.use_cassette(str(tmpdir.join("verify_pool_mgr.yaml"))):
233s > req1 = verify_pool_mgr.request("POST", url, data).data
233s
233s tests/integration/test_urllib3.py:94:
233s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
233s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:143: in request
233s return self.request_encode_body(
233s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:278: in request_encode_body
233s return self.urlopen(method, url, **extra_kw)
233s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen
233s response = conn.urlopen(method, u.request_uri, **kw)
233s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen
233s response = self._make_request(
233s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
233s
233s self =
233s conn =
233s method = 'POST', url = '/post', body = {'key1': 'value1', 'key2': 'value2'}
233s headers = HTTPHeaderDict({})
233s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None)
233s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None)
233s chunked = False, response_conn = None, preload_content = True
233s decode_content = True, enforce_content_length = True
233s response.retries = retries
233s response._connection = response_conn # type: ignore[attr-defined]
233s response._pool = self # type: ignore[attr-defined]
233s
233s log.debug(
233s '%s://%s:%s "%s %s %s" %s %s',
233s self.scheme,
233s self.host,
233s self.port,
233s method,
233s url,
233s > response.version_string,
233s response.status,
233s response.length_remaining,
233s )
233s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
233s
233s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError
233s ----------------------------- Captured stderr call -----------------------------
233s 127.0.0.1 - - [18/Jan/2025 02:31:40] "POST /post HTTP/1.1" 501 159
233s _______________________________ test_gzip[https] _______________________________
233s
233s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-0/test_gzip_https_0')
233s httpbin_both =
233s verify_pool_mgr =
233s
233s def test_gzip(tmpdir, httpbin_both, verify_pool_mgr):
233s """
233s Ensure that requests (actually urllib3) is able to automatically decompress
233s the response body
233s """
233s url = httpbin_both.url + "/gzip"
233s response = verify_pool_mgr.request("GET", url)
233s
233s with vcr.use_cassette(str(tmpdir.join("gzip.yaml"))):
233s > response = verify_pool_mgr.request("GET", url)
233s
233s tests/integration/test_urllib3.py:140:
233s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 233s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 233s return self.request_encode_url( 233s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 233s return self.urlopen(method, url, **extra_kw) 233s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 233s response = conn.urlopen(method, u.request_uri, **kw) 233s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 233s response = self._make_request( 233s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 233s 233s self = 233s conn = 233s method = 'GET', url = '/gzip', body = None, headers = {} 233s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 233s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 233s chunked = False, response_conn = None, preload_content = True 233s decode_content = True, enforce_content_length = True 233s 233s def _make_request( 233s self, 233s conn: BaseHTTPConnection, 233s method: str, 233s url: str, 233s body: _TYPE_BODY | None = None, 233s headers: typing.Mapping[str, str] | None = None, 233s retries: Retry | None = None, 233s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 233s chunked: bool = False, 233s response_conn: BaseHTTPConnection | None = None, 233s preload_content: bool = True, 233s decode_content: bool = True, 233s enforce_content_length: bool = True, 233s ) -> BaseHTTPResponse: 233s """ 233s Perform a request on a given urllib connection object taken from our 233s pool. 233s 233s :param conn: 233s a connection from one of our connection pools 233s 233s :param method: 233s HTTP request method (such as GET, POST, PUT, etc.) 233s 233s :param url: 233s The URL to perform the request on. 
233s 233s :param body: 233s Data to send in the request body, either :class:`str`, :class:`bytes`, 233s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 233s 233s :param headers: 233s Dictionary of custom headers to send, such as User-Agent, 233s If-None-Match, etc. If None, pool headers are used. If provided, 233s these headers completely replace any pool-specific headers. 233s 233s :param retries: 233s Configure the number of retries to allow before raising a 233s :class:`~urllib3.exceptions.MaxRetryError` exception. 233s 233s Pass ``None`` to retry until you receive a response. Pass a 233s :class:`~urllib3.util.retry.Retry` object for fine-grained control 233s over different types of retries. 233s Pass an integer number to retry connection errors that many times, 233s but no other types of errors. Pass zero to never retry. 233s 233s If ``False``, then retries are disabled and any exception is raised 233s immediately. Also, instead of raising a MaxRetryError on redirects, 233s the redirect response will be returned. 233s 233s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 233s 233s :param timeout: 233s If specified, overrides the default timeout for this one 233s request. It may be a float (in seconds) or an instance of 233s :class:`urllib3.util.Timeout`. 233s 233s :param chunked: 233s If True, urllib3 will send the body using chunked transfer 233s encoding. Otherwise, urllib3 will send the body using the standard 233s content-length form. Defaults to False. 233s 233s :param response_conn: 233s Set this to ``None`` if you will handle releasing the connection or 233s set the connection to have the response release it. 233s 233s :param preload_content: 233s If True, the response's body will be preloaded during construction. 233s 233s :param decode_content: 233s If True, will attempt to decode the body based on the 233s 'content-encoding' header. 233s 233s :param enforce_content_length: 233s Enforce content length checking. 
Body returned by server must match 233s value of Content-Length header, if present. Otherwise, raise error. 233s """ 233s self.num_requests += 1 233s 233s timeout_obj = self._get_timeout(timeout) 233s timeout_obj.start_connect() 233s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 233s 233s try: 233s # Trigger any extra validation we need to do. 233s try: 233s self._validate_conn(conn) 233s except (SocketTimeout, BaseSSLError) as e: 233s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 233s raise 233s 233s # _validate_conn() starts the connection to an HTTPS proxy 233s # so we need to wrap errors with 'ProxyError' here too. 233s except ( 233s OSError, 233s NewConnectionError, 233s TimeoutError, 233s BaseSSLError, 233s CertificateError, 233s SSLError, 233s ) as e: 233s new_e: Exception = e 233s if isinstance(e, (BaseSSLError, CertificateError)): 233s new_e = SSLError(e) 233s # If the connection didn't successfully connect to it's proxy 233s # then there 233s if isinstance( 233s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 233s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 233s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 233s raise new_e 233s 233s # conn.request() calls http.client.*.request, not the method in 233s # urllib3.request. It also calls makefile (recv) on the socket. 233s try: 233s conn.request( 233s method, 233s url, 233s body=body, 233s headers=headers, 233s chunked=chunked, 233s preload_content=preload_content, 233s decode_content=decode_content, 233s enforce_content_length=enforce_content_length, 233s ) 233s 233s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 233s # legitimately able to close the connection after sending a valid response. 233s # With this behaviour, the received response is still readable. 
233s except BrokenPipeError: 233s pass 233s except OSError as e: 233s # MacOS/Linux 233s # EPROTOTYPE and ECONNRESET are needed on macOS 233s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 233s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 233s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 233s raise 233s 233s # Reset the timeout for the recv() on the socket 233s read_timeout = timeout_obj.read_timeout 233s 233s if not conn.is_closed: 233s # In Python 3 socket.py will catch EAGAIN and return None when you 233s # try and read into the file pointer created by http.client, which 233s # instead raises a BadStatusLine exception. Instead of catching 233s # the exception and assuming all BadStatusLine exceptions are read 233s # timeouts, check for a zero timeout before making the request. 233s if read_timeout == 0: 233s raise ReadTimeoutError( 233s self, url, f"Read timed out. (read timeout={read_timeout})" 233s ) 233s conn.timeout = read_timeout 233s 233s # Receive the response from the server 233s try: 233s response = conn.getresponse() 233s except (BaseSSLError, OSError) as e: 233s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 233s raise 233s 233s # Set properties that are used by the pooling layer. 
233s response.retries = retries 233s response._connection = response_conn # type: ignore[attr-defined] 233s response._pool = self # type: ignore[attr-defined] 233s 233s log.debug( 233s '%s://%s:%s "%s %s %s" %s %s', 233s self.scheme, 233s self.host, 233s self.port, 233s method, 233s url, 233s > response.version_string, 233s response.status, 233s response.length_remaining, 233s ) 233s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 233s 233s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 233s ----------------------------- Captured stderr call ----------------------------- 233s 127.0.0.1 - - [18/Jan/2025 02:31:40] "GET /gzip HTTP/1.1" 200 165 233s 127.0.0.1 - - [18/Jan/2025 02:31:40] "GET /gzip HTTP/1.1" 200 165 233s ________________________________ test_use_proxy ________________________________ 233s 233s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-0/test_use_proxy0') 233s httpbin = 233s proxy_server = 'http://0.0.0.0:57911' 233s 233s def test_use_proxy(tmpdir, httpbin, proxy_server): 233s """Ensure that it works with a proxy.""" 233s with vcr.use_cassette(str(tmpdir.join("proxy.yaml"))): 233s > response = requests.get(httpbin.url, proxies={"http": proxy_server}) 233s 233s tests/integration/test_proxy.py:53: 233s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 233s /usr/lib/python3/dist-packages/requests/api.py:73: in get 233s return request("get", url, params=params, **kwargs) 233s /usr/lib/python3/dist-packages/requests/api.py:59: in request 233s return session.request(method=method, url=url, **kwargs) 233s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 233s resp = self.send(prep, **send_kwargs) 233s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 233s r = adapter.send(request, **kwargs) 233s /usr/lib/python3/dist-packages/requests/adapters.py:667: in send 233s resp = conn.urlopen( 233s 
/usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 233s response = self._make_request( 233s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 233s 233s self = 233s conn = 233s method = 'GET', url = 'http://127.0.0.1:41053/', body = None 233s headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate, br', 'Accept': '*/*', 'Connection': 'keep-alive'} 233s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 233s timeout = Timeout(connect=None, read=None, total=None), chunked = False 233s response_conn = 233s preload_content = False, decode_content = False, enforce_content_length = True 233s 233s def _make_request( 233s self, 233s conn: BaseHTTPConnection, 233s method: str, 233s url: str, 233s body: _TYPE_BODY | None = None, 233s headers: typing.Mapping[str, str] | None = None, 233s retries: Retry | None = None, 233s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 233s chunked: bool = False, 233s response_conn: BaseHTTPConnection | None = None, 233s preload_content: bool = True, 233s decode_content: bool = True, 233s enforce_content_length: bool = True, 233s ) -> BaseHTTPResponse: 233s """ 233s Perform a request on a given urllib connection object taken from our 233s pool. 233s 233s :param conn: 233s a connection from one of our connection pools 233s 233s :param method: 233s HTTP request method (such as GET, POST, PUT, etc.) 233s 233s :param url: 233s The URL to perform the request on. 233s 233s :param body: 233s Data to send in the request body, either :class:`str`, :class:`bytes`, 233s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 233s 233s :param headers: 233s Dictionary of custom headers to send, such as User-Agent, 233s If-None-Match, etc. If None, pool headers are used. If provided, 233s these headers completely replace any pool-specific headers. 
233s 233s :param retries: 233s Configure the number of retries to allow before raising a 233s :class:`~urllib3.exceptions.MaxRetryError` exception. 233s 233s Pass ``None`` to retry until you receive a response. Pass a 233s :class:`~urllib3.util.retry.Retry` object for fine-grained control 233s over different types of retries. 233s Pass an integer number to retry connection errors that many times, 233s but no other types of errors. Pass zero to never retry. 233s 233s If ``False``, then retries are disabled and any exception is raised 233s immediately. Also, instead of raising a MaxRetryError on redirects, 233s the redirect response will be returned. 233s 233s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 233s 233s :param timeout: 233s If specified, overrides the default timeout for this one 233s request. It may be a float (in seconds) or an instance of 233s :class:`urllib3.util.Timeout`. 233s 233s :param chunked: 233s If True, urllib3 will send the body using chunked transfer 233s encoding. Otherwise, urllib3 will send the body using the standard 233s content-length form. Defaults to False. 233s 233s :param response_conn: 233s Set this to ``None`` if you will handle releasing the connection or 233s set the connection to have the response release it. 233s 233s :param preload_content: 233s If True, the response's body will be preloaded during construction. 233s 233s :param decode_content: 233s If True, will attempt to decode the body based on the 233s 'content-encoding' header. 233s 233s :param enforce_content_length: 233s Enforce content length checking. Body returned by server must match 233s value of Content-Length header, if present. Otherwise, raise error. 233s """ 233s self.num_requests += 1 233s 233s timeout_obj = self._get_timeout(timeout) 233s timeout_obj.start_connect() 233s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 233s 233s try: 233s # Trigger any extra validation we need to do. 
233s try: 233s self._validate_conn(conn) 233s except (SocketTimeout, BaseSSLError) as e: 233s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 233s raise 233s 233s # _validate_conn() starts the connection to an HTTPS proxy 233s # so we need to wrap errors with 'ProxyError' here too. 233s except ( 233s OSError, 233s NewConnectionError, 233s TimeoutError, 233s BaseSSLError, 233s CertificateError, 233s SSLError, 233s ) as e: 233s new_e: Exception = e 233s if isinstance(e, (BaseSSLError, CertificateError)): 233s new_e = SSLError(e) 233s # If the connection didn't successfully connect to it's proxy 233s # then there 233s if isinstance( 233s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 233s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 233s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 233s raise new_e 233s 233s # conn.request() calls http.client.*.request, not the method in 233s # urllib3.request. It also calls makefile (recv) on the socket. 233s try: 233s conn.request( 233s method, 233s url, 233s body=body, 233s headers=headers, 233s chunked=chunked, 233s preload_content=preload_content, 233s decode_content=decode_content, 233s enforce_content_length=enforce_content_length, 233s ) 233s 233s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 233s # legitimately able to close the connection after sending a valid response. 233s # With this behaviour, the received response is still readable. 233s except BrokenPipeError: 233s pass 233s except OSError as e: 233s # MacOS/Linux 233s # EPROTOTYPE and ECONNRESET are needed on macOS 233s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 233s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
233s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 233s raise 233s 233s # Reset the timeout for the recv() on the socket 233s read_timeout = timeout_obj.read_timeout 233s 233s if not conn.is_closed: 233s # In Python 3 socket.py will catch EAGAIN and return None when you 233s # try and read into the file pointer created by http.client, which 233s # instead raises a BadStatusLine exception. Instead of catching 233s # the exception and assuming all BadStatusLine exceptions are read 233s # timeouts, check for a zero timeout before making the request. 233s if read_timeout == 0: 233s raise ReadTimeoutError( 233s self, url, f"Read timed out. (read timeout={read_timeout})" 233s ) 233s conn.timeout = read_timeout 233s 233s # Receive the response from the server 233s try: 233s response = conn.getresponse() 233s except (BaseSSLError, OSError) as e: 233s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 233s raise 233s 233s # Set properties that are used by the pooling layer. 
233s response.retries = retries 233s response._connection = response_conn # type: ignore[attr-defined] 233s response._pool = self # type: ignore[attr-defined] 233s 233s log.debug( 233s '%s://%s:%s "%s %s %s" %s %s', 233s self.scheme, 233s self.host, 233s self.port, 233s method, 233s url, 233s > response.version_string, 233s response.status, 233s response.length_remaining, 233s ) 233s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 233s 233s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 233s ----------------------------- Captured stderr call ----------------------------- 233s 127.0.0.1 - - [18/Jan/2025 02:31:41] "GET / HTTP/1.1" 200 9358 233s 127.0.0.1 - - [18/Jan/2025 02:31:41] "GET http://127.0.0.1:41053/ HTTP/1.1" 200 - 233s ______________________________ test_cross_scheme _______________________________ 233s 233s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-0/test_cross_scheme2') 233s httpbin = 233s httpbin_secure = 233s verify_pool_mgr = 233s 233s def test_cross_scheme(tmpdir, httpbin, httpbin_secure, verify_pool_mgr): 233s """Ensure that requests between schemes are treated separately""" 233s # First fetch a url under http, and then again under https and then 233s # ensure that we haven't served anything out of cache, and we have two 233s # requests / response pairs in the cassette 233s with vcr.use_cassette(str(tmpdir.join("cross_scheme.yaml"))) as cass: 233s > verify_pool_mgr.request("GET", httpbin_secure.url) 233s 233s tests/integration/test_urllib3.py:125: 233s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 233s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 233s return self.request_encode_url( 233s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 233s return self.urlopen(method, url, **extra_kw) 233s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 233s response = 
conn.urlopen(method, u.request_uri, **kw) 233s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 233s response = self._make_request( 233s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 233s 233s self = 233s conn = 233s method = 'GET', url = '/', body = None, headers = {} 233s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 233s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 233s chunked = False, response_conn = None, preload_content = True 233s decode_content = True, enforce_content_length = True 233s 233s def _make_request( 233s self, 233s conn: BaseHTTPConnection, 233s method: str, 233s url: str, 233s body: _TYPE_BODY | None = None, 233s headers: typing.Mapping[str, str] | None = None, 233s retries: Retry | None = None, 233s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 233s chunked: bool = False, 233s response_conn: BaseHTTPConnection | None = None, 233s preload_content: bool = True, 233s decode_content: bool = True, 233s enforce_content_length: bool = True, 233s ) -> BaseHTTPResponse: 233s """ 233s Perform a request on a given urllib connection object taken from our 233s pool. 233s 233s :param conn: 233s a connection from one of our connection pools 233s 233s :param method: 233s HTTP request method (such as GET, POST, PUT, etc.) 233s 233s :param url: 233s The URL to perform the request on. 233s 233s :param body: 233s Data to send in the request body, either :class:`str`, :class:`bytes`, 233s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 233s 233s :param headers: 233s Dictionary of custom headers to send, such as User-Agent, 233s If-None-Match, etc. If None, pool headers are used. If provided, 233s these headers completely replace any pool-specific headers. 233s 233s :param retries: 233s Configure the number of retries to allow before raising a 233s :class:`~urllib3.exceptions.MaxRetryError` exception. 
233s 233s Pass ``None`` to retry until you receive a response. Pass a 233s :class:`~urllib3.util.retry.Retry` object for fine-grained control 233s over different types of retries. 233s Pass an integer number to retry connection errors that many times, 233s but no other types of errors. Pass zero to never retry. 233s 233s If ``False``, then retries are disabled and any exception is raised 233s immediately. Also, instead of raising a MaxRetryError on redirects, 233s the redirect response will be returned. 233s 233s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 233s 233s :param timeout: 233s If specified, overrides the default timeout for this one 233s request. It may be a float (in seconds) or an instance of 233s :class:`urllib3.util.Timeout`. 233s 233s :param chunked: 233s If True, urllib3 will send the body using chunked transfer 233s encoding. Otherwise, urllib3 will send the body using the standard 233s content-length form. Defaults to False. 233s 233s :param response_conn: 233s Set this to ``None`` if you will handle releasing the connection or 233s set the connection to have the response release it. 233s 233s :param preload_content: 233s If True, the response's body will be preloaded during construction. 233s 233s :param decode_content: 233s If True, will attempt to decode the body based on the 233s 'content-encoding' header. 233s 233s :param enforce_content_length: 233s Enforce content length checking. Body returned by server must match 233s value of Content-Length header, if present. Otherwise, raise error. 233s """ 233s self.num_requests += 1 233s 233s timeout_obj = self._get_timeout(timeout) 233s timeout_obj.start_connect() 233s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 233s 233s try: 233s # Trigger any extra validation we need to do. 
233s try: 233s self._validate_conn(conn) 233s except (SocketTimeout, BaseSSLError) as e: 233s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 233s raise 233s 233s # _validate_conn() starts the connection to an HTTPS proxy 233s # so we need to wrap errors with 'ProxyError' here too. 233s except ( 233s OSError, 233s NewConnectionError, 233s TimeoutError, 233s BaseSSLError, 233s CertificateError, 233s SSLError, 233s ) as e: 233s new_e: Exception = e 233s if isinstance(e, (BaseSSLError, CertificateError)): 233s new_e = SSLError(e) 233s # If the connection didn't successfully connect to it's proxy 233s # then there 233s if isinstance( 233s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 233s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 233s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 233s raise new_e 233s 233s # conn.request() calls http.client.*.request, not the method in 233s # urllib3.request. It also calls makefile (recv) on the socket. 233s try: 233s conn.request( 233s method, 233s url, 233s body=body, 233s headers=headers, 233s chunked=chunked, 233s preload_content=preload_content, 233s decode_content=decode_content, 233s enforce_content_length=enforce_content_length, 233s ) 233s 233s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 233s # legitimately able to close the connection after sending a valid response. 233s # With this behaviour, the received response is still readable. 233s except BrokenPipeError: 233s pass 233s except OSError as e: 233s # MacOS/Linux 233s # EPROTOTYPE and ECONNRESET are needed on macOS 233s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 233s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
233s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 233s raise 233s 233s # Reset the timeout for the recv() on the socket 233s read_timeout = timeout_obj.read_timeout 233s 233s if not conn.is_closed: 233s # In Python 3 socket.py will catch EAGAIN and return None when you 233s # try and read into the file pointer created by http.client, which 233s # instead raises a BadStatusLine exception. Instead of catching 233s # the exception and assuming all BadStatusLine exceptions are read 233s # timeouts, check for a zero timeout before making the request. 233s if read_timeout == 0: 233s raise ReadTimeoutError( 233s self, url, f"Read timed out. (read timeout={read_timeout})" 233s ) 233s conn.timeout = read_timeout 233s 233s # Receive the response from the server 233s try: 233s response = conn.getresponse() 233s except (BaseSSLError, OSError) as e: 233s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 233s raise 233s 233s # Set properties that are used by the pooling layer. 
233s response.retries = retries 233s response._connection = response_conn # type: ignore[attr-defined] 233s response._pool = self # type: ignore[attr-defined] 233s 233s log.debug( 233s '%s://%s:%s "%s %s %s" %s %s', 233s self.scheme, 233s self.host, 233s self.port, 233s method, 233s url, 233s > response.version_string, 233s response.status, 233s response.length_remaining, 233s ) 233s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 233s 233s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 233s ----------------------------- Captured stderr call ----------------------------- 233s 127.0.0.1 - - [18/Jan/2025 02:31:41] "GET / HTTP/1.1" 200 9358 233s ___________________ test_https_with_cert_validation_disabled ___________________ 233s 233s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-0/test_https_with_cert_validatio0') 233s httpbin_secure = 233s pool_mgr = 233s 233s def test_https_with_cert_validation_disabled(tmpdir, httpbin_secure, pool_mgr): 233s with vcr.use_cassette(str(tmpdir.join("cert_validation_disabled.yaml"))): 233s > pool_mgr.request("GET", httpbin_secure.url) 233s 233s tests/integration/test_urllib3.py:149: 233s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 233s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 233s return self.request_encode_url( 233s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 233s return self.urlopen(method, url, **extra_kw) 233s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 233s response = conn.urlopen(method, u.request_uri, **kw) 233s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 233s response = self._make_request( 233s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 233s 233s self = 233s conn = 233s method = 'GET', url = '/', body = None, headers = {} 233s retries = Retry(total=3, connect=None, 
read=None, redirect=None, status=None) 233s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 233s chunked = False, response_conn = None, preload_content = True 233s decode_content = True, enforce_content_length = True 233s 233s def _make_request( 233s self, 233s conn: BaseHTTPConnection, 233s method: str, 233s url: str, 233s body: _TYPE_BODY | None = None, 233s headers: typing.Mapping[str, str] | None = None, 233s retries: Retry | None = None, 233s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 233s chunked: bool = False, 233s response_conn: BaseHTTPConnection | None = None, 233s preload_content: bool = True, 233s decode_content: bool = True, 233s enforce_content_length: bool = True, 233s ) -> BaseHTTPResponse: 233s """ 233s Perform a request on a given urllib connection object taken from our 233s pool. 233s 233s :param conn: 233s a connection from one of our connection pools 233s 233s :param method: 233s HTTP request method (such as GET, POST, PUT, etc.) 233s 233s :param url: 233s The URL to perform the request on. 233s 233s :param body: 233s Data to send in the request body, either :class:`str`, :class:`bytes`, 233s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 233s 233s :param headers: 233s Dictionary of custom headers to send, such as User-Agent, 233s If-None-Match, etc. If None, pool headers are used. If provided, 233s these headers completely replace any pool-specific headers. 233s 233s :param retries: 233s Configure the number of retries to allow before raising a 233s :class:`~urllib3.exceptions.MaxRetryError` exception. 233s 233s Pass ``None`` to retry until you receive a response. Pass a 233s :class:`~urllib3.util.retry.Retry` object for fine-grained control 233s over different types of retries. 233s Pass an integer number to retry connection errors that many times, 233s but no other types of errors. Pass zero to never retry. 
233s 233s If ``False``, then retries are disabled and any exception is raised 233s immediately. Also, instead of raising a MaxRetryError on redirects, 233s the redirect response will be returned. 233s 233s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 233s 233s :param timeout: 233s If specified, overrides the default timeout for this one 233s request. It may be a float (in seconds) or an instance of 233s :class:`urllib3.util.Timeout`. 233s 233s :param chunked: 233s If True, urllib3 will send the body using chunked transfer 233s encoding. Otherwise, urllib3 will send the body using the standard 233s content-length form. Defaults to False. 233s 233s :param response_conn: 233s Set this to ``None`` if you will handle releasing the connection or 233s set the connection to have the response release it. 233s 233s :param preload_content: 233s If True, the response's body will be preloaded during construction. 233s 233s :param decode_content: 233s If True, will attempt to decode the body based on the 233s 'content-encoding' header. 233s 233s :param enforce_content_length: 233s Enforce content length checking. Body returned by server must match 233s value of Content-Length header, if present. Otherwise, raise error. 233s """ 233s self.num_requests += 1 233s 233s timeout_obj = self._get_timeout(timeout) 233s timeout_obj.start_connect() 233s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 233s 233s try: 233s # Trigger any extra validation we need to do. 233s try: 233s self._validate_conn(conn) 233s except (SocketTimeout, BaseSSLError) as e: 233s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 233s raise 233s 233s # _validate_conn() starts the connection to an HTTPS proxy 233s # so we need to wrap errors with 'ProxyError' here too. 
233s except ( 233s OSError, 233s NewConnectionError, 233s TimeoutError, 233s BaseSSLError, 233s CertificateError, 233s SSLError, 233s ) as e: 233s new_e: Exception = e 233s if isinstance(e, (BaseSSLError, CertificateError)): 233s new_e = SSLError(e) 233s # If the connection didn't successfully connect to it's proxy 233s # then there 233s if isinstance( 233s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 233s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 233s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 233s raise new_e 233s 233s # conn.request() calls http.client.*.request, not the method in 233s # urllib3.request. It also calls makefile (recv) on the socket. 233s try: 233s conn.request( 233s method, 233s url, 233s body=body, 233s headers=headers, 233s chunked=chunked, 233s preload_content=preload_content, 233s decode_content=decode_content, 233s enforce_content_length=enforce_content_length, 233s ) 233s 233s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 233s # legitimately able to close the connection after sending a valid response. 233s # With this behaviour, the received response is still readable. 233s except BrokenPipeError: 233s pass 233s except OSError as e: 233s # MacOS/Linux 233s # EPROTOTYPE and ECONNRESET are needed on macOS 233s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 233s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 233s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 233s raise 233s 233s # Reset the timeout for the recv() on the socket 233s read_timeout = timeout_obj.read_timeout 233s 233s if not conn.is_closed: 233s # In Python 3 socket.py will catch EAGAIN and return None when you 233s # try and read into the file pointer created by http.client, which 233s # instead raises a BadStatusLine exception. 
Instead of catching 233s # the exception and assuming all BadStatusLine exceptions are read 233s # timeouts, check for a zero timeout before making the request. 233s if read_timeout == 0: 233s raise ReadTimeoutError( 233s self, url, f"Read timed out. (read timeout={read_timeout})" 233s ) 233s conn.timeout = read_timeout 233s 233s # Receive the response from the server 233s try: 233s response = conn.getresponse() 233s except (BaseSSLError, OSError) as e: 233s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 233s raise 233s 233s # Set properties that are used by the pooling layer. 233s response.retries = retries 233s response._connection = response_conn # type: ignore[attr-defined] 233s response._pool = self # type: ignore[attr-defined] 233s 233s log.debug( 233s '%s://%s:%s "%s %s %s" %s %s', 233s self.scheme, 233s self.host, 233s self.port, 233s method, 233s url, 233s > response.version_string, 233s response.status, 233s response.length_remaining, 233s ) 233s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 233s 233s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 233s ----------------------------- Captured stderr call ----------------------------- 233s 127.0.0.1 - - [18/Jan/2025 02:31:41] "GET / HTTP/1.1" 200 9358 233s _____________________________ test_domain_redirect _____________________________ 233s 233s def test_domain_redirect(): 233s """Ensure that redirects across domains are considered unique""" 233s # In this example, seomoz.org redirects to moz.com, and if those 233s # requests are considered identical, then we'll be stuck in a redirect 233s # loop. 
233s url = "http://seomoz.org/" 233s with vcr.use_cassette("tests/fixtures/wild/domain_redirect.yaml") as cass: 233s > requests.get(url, headers={"User-Agent": "vcrpy-test"}) 233s 233s tests/integration/test_wild.py:20: 233s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 233s /usr/lib/python3/dist-packages/requests/api.py:73: in get 233s return request("get", url, params=params, **kwargs) 233s /usr/lib/python3/dist-packages/requests/api.py:59: in request 233s return session.request(method=method, url=url, **kwargs) 233s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 233s resp = self.send(prep, **send_kwargs) 233s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 233s r = adapter.send(request, **kwargs) 233s /usr/lib/python3/dist-packages/requests/adapters.py:667: in send 233s resp = conn.urlopen( 233s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 233s response = self._make_request( 233s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 233s 233s self = 233s conn = 233s method = 'GET', url = '/', body = None 233s headers = {'User-Agent': 'vcrpy-test', 'Accept-Encoding': 'gzip, deflate, br', 'Accept': '*/*', 'Connection': 'keep-alive'} 233s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 233s timeout = Timeout(connect=None, read=None, total=None), chunked = False 233s response_conn = 233s preload_content = False, decode_content = False, enforce_content_length = True 233s 233s def _make_request( 233s self, 233s conn: BaseHTTPConnection, 233s method: str, 233s url: str, 233s body: _TYPE_BODY | None = None, 233s headers: typing.Mapping[str, str] | None = None, 233s retries: Retry | None = None, 233s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 233s chunked: bool = False, 233s response_conn: BaseHTTPConnection | None = None, 233s preload_content: bool = True, 233s decode_content: bool = True, 233s 
enforce_content_length: bool = True, 233s ) -> BaseHTTPResponse: 233s """ 233s Perform a request on a given urllib connection object taken from our 233s pool. 233s 233s :param conn: 233s a connection from one of our connection pools 233s 233s :param method: 233s HTTP request method (such as GET, POST, PUT, etc.) 233s 233s :param url: 233s The URL to perform the request on. 233s 233s :param body: 233s Data to send in the request body, either :class:`str`, :class:`bytes`, 233s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 233s 233s :param headers: 233s Dictionary of custom headers to send, such as User-Agent, 233s If-None-Match, etc. If None, pool headers are used. If provided, 233s these headers completely replace any pool-specific headers. 233s 233s :param retries: 233s Configure the number of retries to allow before raising a 233s :class:`~urllib3.exceptions.MaxRetryError` exception. 233s 233s Pass ``None`` to retry until you receive a response. Pass a 233s :class:`~urllib3.util.retry.Retry` object for fine-grained control 233s over different types of retries. 233s Pass an integer number to retry connection errors that many times, 233s but no other types of errors. Pass zero to never retry. 233s 233s If ``False``, then retries are disabled and any exception is raised 233s immediately. Also, instead of raising a MaxRetryError on redirects, 233s the redirect response will be returned. 233s 233s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 233s 233s :param timeout: 233s If specified, overrides the default timeout for this one 233s request. It may be a float (in seconds) or an instance of 233s :class:`urllib3.util.Timeout`. 233s 233s :param chunked: 233s If True, urllib3 will send the body using chunked transfer 233s encoding. Otherwise, urllib3 will send the body using the standard 233s content-length form. Defaults to False. 
233s 233s :param response_conn: 233s Set this to ``None`` if you will handle releasing the connection or 233s set the connection to have the response release it. 233s 233s :param preload_content: 233s If True, the response's body will be preloaded during construction. 233s 233s :param decode_content: 233s If True, will attempt to decode the body based on the 233s 'content-encoding' header. 233s 233s :param enforce_content_length: 233s Enforce content length checking. Body returned by server must match 233s value of Content-Length header, if present. Otherwise, raise error. 233s """ 233s self.num_requests += 1 233s 233s timeout_obj = self._get_timeout(timeout) 233s timeout_obj.start_connect() 233s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 233s 233s try: 233s # Trigger any extra validation we need to do. 233s try: 233s self._validate_conn(conn) 233s except (SocketTimeout, BaseSSLError) as e: 233s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 233s raise 233s 233s # _validate_conn() starts the connection to an HTTPS proxy 233s # so we need to wrap errors with 'ProxyError' here too. 233s except ( 233s OSError, 233s NewConnectionError, 233s TimeoutError, 233s BaseSSLError, 233s CertificateError, 233s SSLError, 233s ) as e: 233s new_e: Exception = e 233s if isinstance(e, (BaseSSLError, CertificateError)): 233s new_e = SSLError(e) 233s # If the connection didn't successfully connect to it's proxy 233s # then there 233s if isinstance( 233s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 233s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 233s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 233s raise new_e 233s 233s # conn.request() calls http.client.*.request, not the method in 233s # urllib3.request. It also calls makefile (recv) on the socket. 
233s try: 233s conn.request( 233s method, 233s url, 233s body=body, 233s headers=headers, 233s chunked=chunked, 233s preload_content=preload_content, 233s decode_content=decode_content, 233s enforce_content_length=enforce_content_length, 233s ) 233s 233s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 233s # legitimately able to close the connection after sending a valid response. 233s # With this behaviour, the received response is still readable. 233s except BrokenPipeError: 233s pass 233s except OSError as e: 233s # MacOS/Linux 233s # EPROTOTYPE and ECONNRESET are needed on macOS 233s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 233s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 233s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 233s raise 233s 233s # Reset the timeout for the recv() on the socket 233s read_timeout = timeout_obj.read_timeout 233s 233s if not conn.is_closed: 233s # In Python 3 socket.py will catch EAGAIN and return None when you 233s # try and read into the file pointer created by http.client, which 233s # instead raises a BadStatusLine exception. Instead of catching 233s # the exception and assuming all BadStatusLine exceptions are read 233s # timeouts, check for a zero timeout before making the request. 233s if read_timeout == 0: 233s raise ReadTimeoutError( 233s self, url, f"Read timed out. (read timeout={read_timeout})" 233s ) 233s conn.timeout = read_timeout 233s 233s # Receive the response from the server 233s try: 233s response = conn.getresponse() 233s except (BaseSSLError, OSError) as e: 233s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 233s raise 233s 233s # Set properties that are used by the pooling layer. 
233s response.retries = retries 233s response._connection = response_conn # type: ignore[attr-defined] 233s response._pool = self # type: ignore[attr-defined] 233s 233s log.debug( 233s '%s://%s:%s "%s %s %s" %s %s', 233s self.scheme, 233s self.host, 233s self.port, 233s method, 233s url, 233s > response.version_string, 233s response.status, 233s response.length_remaining, 233s ) 233s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 233s 233s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 233s _________________________________ test_cookies _________________________________ 233s 233s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-0/test_cookies0') 233s httpbin = 233s 233s def test_cookies(tmpdir, httpbin): 233s testfile = str(tmpdir.join("cookies.yml")) 233s with vcr.use_cassette(testfile): 233s with requests.Session() as s: 233s > s.get(httpbin.url + "/cookies/set?k1=v1&k2=v2") 233s 233s tests/integration/test_wild.py:67: 233s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 233s /usr/lib/python3/dist-packages/requests/sessions.py:602: in get 233s return self.request("GET", url, **kwargs) 233s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 233s resp = self.send(prep, **send_kwargs) 233s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 233s r = adapter.send(request, **kwargs) 233s /usr/lib/python3/dist-packages/requests/adapters.py:667: in send 233s resp = conn.urlopen( 233s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 233s response = self._make_request( 233s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 233s 233s self = 233s conn = 233s method = 'GET', url = '/cookies/set?k1=v1&k2=v2', body = None 233s headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate, br', 'Accept': '*/*', 'Connection': 'keep-alive'} 233s retries = Retry(total=0, connect=None, 
read=False, redirect=None, status=None) 233s timeout = Timeout(connect=None, read=None, total=None), chunked = False 233s response_conn = 233s preload_content = False, decode_content = False, enforce_content_length = True 233s 233s log.debug( 233s '%s://%s:%s "%s %s %s" %s %s', 233s self.scheme, 233s self.host, 233s self.port, 233s method, 233s url, 233s > response.version_string, 233s response.status, 233s response.length_remaining, 233s ) 233s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 233s 233s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 233s ----------------------------- Captured stderr call ----------------------------- 233s 127.0.0.1 - - [18/Jan/2025 02:31:41] "GET /cookies/set?k1=v1&k2=v2 HTTP/1.1" 302 203 233s =============================== warnings summary =============================== 233s tests/integration/test_config.py:10 233s /tmp/autopkgtest.vKDz2s/autopkgtest_tmp/build/tests/integration/test_config.py:10: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 233s @pytest.mark.online 233s 233s tests/integration/test_config.py:24 233s /tmp/autopkgtest.vKDz2s/autopkgtest_tmp/build/tests/integration/test_config.py:24: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo?
You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 233s @pytest.mark.online 233s 233s tests/integration/test_config.py:34 233s /tmp/autopkgtest.vKDz2s/autopkgtest_tmp/build/tests/integration/test_config.py:34: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 233s @pytest.mark.online 233s 233s tests/integration/test_config.py:47 233s /tmp/autopkgtest.vKDz2s/autopkgtest_tmp/build/tests/integration/test_config.py:47: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 233s @pytest.mark.online 233s 233s tests/integration/test_config.py:69 233s /tmp/autopkgtest.vKDz2s/autopkgtest_tmp/build/tests/integration/test_config.py:69: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 233s @pytest.mark.online 233s 233s tests/integration/test_disksaver.py:14 233s /tmp/autopkgtest.vKDz2s/autopkgtest_tmp/build/tests/integration/test_disksaver.py:14: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 233s @pytest.mark.online 233s 233s tests/integration/test_disksaver.py:35 233s /tmp/autopkgtest.vKDz2s/autopkgtest_tmp/build/tests/integration/test_disksaver.py:35: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? 
You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 233s @pytest.mark.online 233s 233s tests/integration/test_httplib2.py:60 233s /tmp/autopkgtest.vKDz2s/autopkgtest_tmp/build/tests/integration/test_httplib2.py:60: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 233s @pytest.mark.online 233s 233s tests/integration/test_register_matcher.py:16 233s /tmp/autopkgtest.vKDz2s/autopkgtest_tmp/build/tests/integration/test_register_matcher.py:16: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 233s @pytest.mark.online 233s 233s tests/integration/test_register_matcher.py:32 233s /tmp/autopkgtest.vKDz2s/autopkgtest_tmp/build/tests/integration/test_register_matcher.py:32: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 233s @pytest.mark.online 233s 233s tests/integration/test_urllib2.py:60 233s /tmp/autopkgtest.vKDz2s/autopkgtest_tmp/build/tests/integration/test_urllib2.py:60: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 233s @mark.online 233s 233s tests/integration/test_urllib3.py:102 233s /tmp/autopkgtest.vKDz2s/autopkgtest_tmp/build/tests/integration/test_urllib3.py:102: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? 
You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 233s @pytest.mark.online 233s 233s tests/integration/test_wild.py:55 233s /tmp/autopkgtest.vKDz2s/autopkgtest_tmp/build/tests/integration/test_wild.py:55: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 233s @pytest.mark.online 233s 233s tests/integration/test_wild.py:74 233s /tmp/autopkgtest.vKDz2s/autopkgtest_tmp/build/tests/integration/test_wild.py:74: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 233s @pytest.mark.online 233s 233s tests/unit/test_stubs.py:20 233s /tmp/autopkgtest.vKDz2s/autopkgtest_tmp/build/tests/unit/test_stubs.py:20: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 233s @mark.online 233s 233s tests/unit/test_unittest.py:131 233s /tmp/autopkgtest.vKDz2s/autopkgtest_tmp/build/tests/unit/test_unittest.py:131: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 233s @pytest.mark.online 233s 233s tests/unit/test_unittest.py:166 233s /tmp/autopkgtest.vKDz2s/autopkgtest_tmp/build/tests/unit/test_unittest.py:166: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? 
You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 233s @pytest.mark.online 233s 233s tests/integration/test_wild.py::test_xmlrpclib 233s /usr/lib/python3.13/multiprocessing/popen_fork.py:67: DeprecationWarning: This process (pid=2896) is multi-threaded, use of fork() may lead to deadlocks in the child. 233s self.pid = os.fork() 233s 233s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 233s =========================== short test summary info ============================ 233s FAILED tests/integration/test_urllib3.py::test_status_code[http] - AttributeE... 233s FAILED tests/integration/test_urllib3.py::test_headers[http] - AttributeError... 233s FAILED tests/integration/test_urllib3.py::test_body[http] - AttributeError: '... 233s FAILED tests/integration/test_urllib3.py::test_auth[http] - AttributeError: '... 233s FAILED tests/integration/test_urllib3.py::test_auth_failed[http] - AttributeE... 233s FAILED tests/integration/test_urllib3.py::test_post[http] - AttributeError: '... 233s FAILED tests/integration/test_urllib3.py::test_gzip[http] - AttributeError: '... 233s FAILED tests/integration/test_urllib3.py::test_status_code[https] - Attribute... 233s FAILED tests/integration/test_urllib3.py::test_headers[https] - AttributeErro... 233s FAILED tests/integration/test_urllib3.py::test_body[https] - AttributeError: ... 233s FAILED tests/integration/test_urllib3.py::test_auth[https] - AttributeError: ... 233s FAILED tests/integration/test_urllib3.py::test_auth_failed[https] - Attribute... 233s FAILED tests/integration/test_urllib3.py::test_post[https] - AttributeError: ... 233s FAILED tests/integration/test_urllib3.py::test_gzip[https] - AttributeError: ... 233s FAILED tests/integration/test_proxy.py::test_use_proxy - AttributeError: 'VCR... 233s FAILED tests/integration/test_urllib3.py::test_cross_scheme - AttributeError:... 
233s FAILED tests/integration/test_urllib3.py::test_https_with_cert_validation_disabled 233s FAILED tests/integration/test_wild.py::test_domain_redirect - AttributeError:... 233s FAILED tests/integration/test_wild.py::test_cookies - AttributeError: 'VCRHTT... 233s ==== 19 failed, 265 passed, 3 skipped, 19 deselected, 18 warnings in 4.77s ===== 233s E: pybuild pybuild:389: test: plugin distutils failed with: exit code=1: cd /tmp/autopkgtest.vKDz2s/autopkgtest_tmp/build; python3.13 -m pytest --ignore tests/integration/test_aiohttp.py --ignore tests/integration/test_tornado.py --ignore tests/integration/test_requests.py -m "not online" -k "not test_basic_json_use and not test_load_cassette_with_custom_persister" 233s I: pybuild base:311: cd /tmp/autopkgtest.vKDz2s/autopkgtest_tmp/build; python3.12 -m pytest --ignore tests/integration/test_aiohttp.py --ignore tests/integration/test_tornado.py --ignore tests/integration/test_requests.py -m "not online" -k "not test_basic_json_use and not test_load_cassette_with_custom_persister" 234s ============================= test session starts ============================== 234s platform linux -- Python 3.12.8, pytest-8.3.4, pluggy-1.5.0 234s rootdir: /tmp/autopkgtest.vKDz2s/autopkgtest_tmp/build 234s plugins: tornado-0.8.1, httpbin-2.1.0, typeguard-4.4.1 234s collected 305 items / 19 deselected / 1 skipped / 286 selected 234s 235s tests/integration/test_basic.py .... [ 1%] 235s tests/integration/test_boto3.py ss [ 2%] 235s tests/integration/test_config.py . [ 2%] 235s tests/integration/test_filter.py .......... [ 5%] 235s tests/integration/test_httplib2.py ........ [ 8%] 235s tests/integration/test_urllib2.py ........ [ 11%] 235s tests/integration/test_urllib3.py FFFFFFF [ 13%] 235s tests/integration/test_httplib2.py ........ [ 16%] 236s tests/integration/test_urllib2.py ........ [ 19%] 236s tests/integration/test_urllib3.py FFFFFFF [ 22%] 236s tests/integration/test_httplib2.py . [ 22%] 236s tests/integration/test_ignore.py .... 
[ 23%] 236s tests/integration/test_matchers.py .............. [ 28%] 236s tests/integration/test_multiple.py . [ 29%] 236s tests/integration/test_proxy.py F [ 29%] 236s tests/integration/test_record_mode.py ........ [ 32%] 236s tests/integration/test_register_persister.py .. [ 32%] 236s tests/integration/test_register_serializer.py . [ 33%] 236s tests/integration/test_request.py .. [ 33%] 236s tests/integration/test_stubs.py .... [ 35%] 236s tests/integration/test_urllib2.py . [ 35%] 237s tests/integration/test_urllib3.py FF. [ 36%] 237s tests/integration/test_wild.py F.F. [ 38%] 237s tests/unit/test_cassettes.py ............................... [ 48%] 237s tests/unit/test_errors.py .... [ 50%] 237s tests/unit/test_filters.py ........................ [ 58%] 237s tests/unit/test_json_serializer.py . [ 59%] 237s tests/unit/test_matchers.py ............................ [ 68%] 237s tests/unit/test_migration.py ... [ 69%] 237s tests/unit/test_persist.py .... [ 71%] 237s tests/unit/test_request.py ................. [ 77%] 237s tests/unit/test_response.py .... [ 78%] 237s tests/unit/test_serialize.py ............... [ 83%] 237s tests/unit/test_stubs.py ... [ 84%] 237s tests/unit/test_unittest.py ....... [ 87%] 237s tests/unit/test_util.py ........... [ 91%] 237s tests/unit/test_vcr.py ........................ [ 99%] 238s tests/unit/test_vcr_import.py . 
[100%] 238s 238s =================================== FAILURES =================================== 238s ____________________________ test_status_code[http] ____________________________ 238s 238s httpbin_both = 238s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-1/test_status_code_http_0') 238s verify_pool_mgr = 238s 238s def test_status_code(httpbin_both, tmpdir, verify_pool_mgr): 238s """Ensure that we can read the status code""" 238s url = httpbin_both.url 238s with vcr.use_cassette(str(tmpdir.join("atts.yaml"))): 238s > status_code = verify_pool_mgr.request("GET", url).status 238s 238s tests/integration/test_urllib3.py:34: 238s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 238s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 238s return self.request_encode_url( 238s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 238s return self.urlopen(method, url, **extra_kw) 238s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 238s response = conn.urlopen(method, u.request_uri, **kw) 238s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 238s response = self._make_request( 238s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 238s 238s self = 238s conn = 238s method = 'GET', url = '/', body = None, headers = {} 238s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 238s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 238s chunked = False, response_conn = None, preload_content = True 238s decode_content = True, enforce_content_length = True 238s 238s log.debug( 238s '%s://%s:%s "%s %s %s" %s %s', 238s self.scheme, 238s self.host, 238s self.port, 238s method, 238s url, 238s > response.version_string, 238s response.status, 238s response.length_remaining, 238s ) 238s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 238s 238s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 238s ----------------------------- Captured stderr call ----------------------------- 238s 127.0.0.1 - - [18/Jan/2025 02:31:45] "GET / HTTP/1.1" 200 9358 238s ______________________________ test_headers[http] ______________________________ 238s 238s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-1/test_headers_http_0') 238s httpbin_both = 238s verify_pool_mgr = 238s 238s def test_headers(tmpdir, httpbin_both, verify_pool_mgr): 238s """Ensure that we can read the headers back""" 238s url = httpbin_both.url 238s with vcr.use_cassette(str(tmpdir.join("headers.yaml"))): 238s > headers = verify_pool_mgr.request("GET", url).headers 238s 238s tests/integration/test_urllib3.py:44: 238s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 238s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 238s return self.request_encode_url( 238s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 238s return self.urlopen(method, url, **extra_kw) 238s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 238s response = conn.urlopen(method, u.request_uri, **kw) 238s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 238s response = self._make_request( 238s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 238s 238s self = 238s conn = 238s method = 'GET', url = '/', body = None, headers = {} 238s
retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 238s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 238s chunked = False, response_conn = None, preload_content = True 238s decode_content = True, enforce_content_length = True 238s > response.version_string, 238s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 238s 238s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 238s ----------------------------- Captured stderr call ----------------------------- 238s 127.0.0.1 - - [18/Jan/2025 02:31:45] "GET / HTTP/1.1" 200 9358 238s _______________________________ test_body[http] ________________________________ 238s 238s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-1/test_body_http_0') 238s httpbin_both = 238s verify_pool_mgr = 238s 238s def test_body(tmpdir, httpbin_both, verify_pool_mgr): 238s """Ensure the responses are all identical enough""" 238s url = httpbin_both.url + "/bytes/1024" 238s with vcr.use_cassette(str(tmpdir.join("body.yaml"))): 238s > content = verify_pool_mgr.request("GET", url).data 238s 238s tests/integration/test_urllib3.py:55: 238s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 238s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in
request 238s return self.request_encode_url( 238s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 238s return self.urlopen(method, url, **extra_kw) 238s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 238s response = conn.urlopen(method, u.request_uri, **kw) 238s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 238s response = self._make_request( 238s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 238s 238s self = 238s conn = 238s method = 'GET', url = '/bytes/1024', body = None, headers = {} 238s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 238s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 238s chunked = False, response_conn = None, preload_content = True 238s decode_content = True, enforce_content_length = True 238s > response.version_string, 238s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 238s 238s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 238s ----------------------------- Captured stderr call ----------------------------- 238s 127.0.0.1 - - [18/Jan/2025 02:31:45] "GET /bytes/1024 HTTP/1.1" 200 1024 238s _______________________________ test_auth[http] ________________________________ 238s 238s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-1/test_auth_http_0') 238s httpbin_both = 238s verify_pool_mgr = 238s 238s def test_auth(tmpdir, httpbin_both, verify_pool_mgr): 238s """Ensure that we can handle basic auth""" 238s auth = ("user", "passwd") 238s headers = urllib3.util.make_headers(basic_auth="{}:{}".format(*auth)) 238s url = httpbin_both.url + "/basic-auth/user/passwd" 238s with vcr.use_cassette(str(tmpdir.join("auth.yaml"))): 238s > one = verify_pool_mgr.request("GET", url, headers=headers) 238s 238s tests/integration/test_urllib3.py:67: 238s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 238s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 238s return self.request_encode_url( 238s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 238s return self.urlopen(method, url, **extra_kw) 238s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 238s response = conn.urlopen(method, u.request_uri, **kw) 238s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 238s response = self._make_request( 238s _ _ _ _ _ _ _ _ _ _ _ _ _
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 238s 238s self = 238s conn = 238s method = 'GET', url = '/basic-auth/user/passwd', body = None 238s headers = {'authorization': 'Basic dXNlcjpwYXNzd2Q='} 238s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 238s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 238s chunked = False, response_conn = None, preload_content = True 238s decode_content = True, enforce_content_length = True 238s > response.version_string, 238s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 238s 238s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 238s ----------------------------- Captured stderr call ----------------------------- 238s 127.0.0.1 - - [18/Jan/2025 02:31:45] "GET /basic-auth/user/passwd HTTP/1.1" 200 46 238s ____________________________ test_auth_failed[http] ____________________________ 238s 238s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-1/test_auth_failed_http_0') 238s httpbin_both = 238s verify_pool_mgr = 238s 238s def test_auth_failed(tmpdir, httpbin_both, verify_pool_mgr): 238s """Ensure that we can save failed auth statuses""" 238s auth = ("user", "wrongwrongwrong") 238s headers = urllib3.util.make_headers(basic_auth="{}:{}".format(*auth)) 238s url = httpbin_both.url + "/basic-auth/user/passwd" 238s with vcr.use_cassette(str(tmpdir.join("auth-failed.yaml"))) as cass: 238s # Ensure that this is empty to begin with 238s assert_cassette_empty(cass) 238s > one = verify_pool_mgr.request("GET", url, headers=headers) 238s 238s tests/integration/test_urllib3.py:83: 238s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 238s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 238s return self.request_encode_url( 238s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 238s return self.urlopen(method, url, **extra_kw) 238s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 238s response = conn.urlopen(method, u.request_uri, **kw) 238s
/usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 238s response = self._make_request( 238s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 238s 238s self = 238s conn = 238s method = 'GET', url = '/basic-auth/user/passwd', body = None 238s headers = {'authorization': 'Basic dXNlcjp3cm9uZ3dyb25nd3Jvbmc='} 238s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 238s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 238s chunked = False, response_conn = None, preload_content = True 238s decode_content = True, enforce_content_length = True 238s
238s response.retries = retries 238s response._connection = response_conn # type: ignore[attr-defined] 238s response._pool = self # type: ignore[attr-defined] 238s 238s log.debug( 238s '%s://%s:%s "%s %s %s" %s %s', 238s self.scheme, 238s self.host, 238s self.port, 238s method, 238s url, 238s > response.version_string, 238s response.status, 238s response.length_remaining, 238s ) 238s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 238s 238s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 238s ----------------------------- Captured stderr call ----------------------------- 238s 127.0.0.1 - - [18/Jan/2025 02:31:45] "GET /basic-auth/user/passwd HTTP/1.1" 401 0 238s _______________________________ test_post[http] ________________________________ 238s 238s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-1/test_post_http_0') 238s httpbin_both = 238s verify_pool_mgr = 238s 238s def test_post(tmpdir, httpbin_both, verify_pool_mgr): 238s """Ensure that we can post and cache the results""" 238s data = {"key1": "value1", "key2": "value2"} 238s url = httpbin_both.url + "/post" 238s with vcr.use_cassette(str(tmpdir.join("verify_pool_mgr.yaml"))): 238s > req1 = verify_pool_mgr.request("POST", url, data).data 238s 238s tests/integration/test_urllib3.py:94: 238s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 238s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:143: in request 238s return self.request_encode_body( 238s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:278: in request_encode_body 238s return self.urlopen(method, url, **extra_kw) 238s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 238s response = conn.urlopen(method, u.request_uri, **kw) 238s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 238s response = self._make_request( 238s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 238s 
238s self =
238s conn =
238s method = 'POST', url = '/post', body = {'key1': 'value1', 'key2': 'value2'}
238s headers = HTTPHeaderDict({})
238s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None)
238s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None)
238s chunked = False, response_conn = None, preload_content = True
238s decode_content = True, enforce_content_length = True
238s
238s def _make_request(
238s self,
238s conn: BaseHTTPConnection,
238s method: str,
238s url: str,
238s body: _TYPE_BODY | None = None,
238s headers: typing.Mapping[str, str] | None = None,
238s retries: Retry | None = None,
238s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
238s chunked: bool = False,
238s response_conn: BaseHTTPConnection | None = None,
238s preload_content: bool = True,
238s decode_content: bool = True,
238s enforce_content_length: bool = True,
238s ) -> BaseHTTPResponse:
238s """
238s Perform a request on a given urllib connection object taken from our
238s pool.
238s
238s :param conn:
238s a connection from one of our connection pools
238s
238s :param method:
238s HTTP request method (such as GET, POST, PUT, etc.)
238s
238s :param url:
238s The URL to perform the request on.
238s
238s :param body:
238s Data to send in the request body, either :class:`str`, :class:`bytes`,
238s an iterable of :class:`str`/:class:`bytes`, or a file-like object.
238s
238s :param headers:
238s Dictionary of custom headers to send, such as User-Agent,
238s If-None-Match, etc. If None, pool headers are used. If provided,
238s these headers completely replace any pool-specific headers.
238s
238s :param retries:
238s Configure the number of retries to allow before raising a
238s :class:`~urllib3.exceptions.MaxRetryError` exception.
238s
238s Pass ``None`` to retry until you receive a response. Pass a
238s :class:`~urllib3.util.retry.Retry` object for fine-grained control
238s over different types of retries.
238s Pass an integer number to retry connection errors that many times,
238s but no other types of errors. Pass zero to never retry.
238s
238s If ``False``, then retries are disabled and any exception is raised
238s immediately. Also, instead of raising a MaxRetryError on redirects,
238s the redirect response will be returned.
238s
238s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
238s
238s :param timeout:
238s If specified, overrides the default timeout for this one
238s request. It may be a float (in seconds) or an instance of
238s :class:`urllib3.util.Timeout`.
238s
238s :param chunked:
238s If True, urllib3 will send the body using chunked transfer
238s encoding. Otherwise, urllib3 will send the body using the standard
238s content-length form. Defaults to False.
238s
238s :param response_conn:
238s Set this to ``None`` if you will handle releasing the connection or
238s set the connection to have the response release it.
238s
238s :param preload_content:
238s If True, the response's body will be preloaded during construction.
238s
238s :param decode_content:
238s If True, will attempt to decode the body based on the
238s 'content-encoding' header.
238s
238s :param enforce_content_length:
238s Enforce content length checking. Body returned by server must match
238s value of Content-Length header, if present. Otherwise, raise error.
238s """
238s self.num_requests += 1
238s
238s timeout_obj = self._get_timeout(timeout)
238s timeout_obj.start_connect()
238s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout)
238s
238s try:
238s # Trigger any extra validation we need to do.
238s try:
238s self._validate_conn(conn)
238s except (SocketTimeout, BaseSSLError) as e:
238s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout)
238s raise
238s
238s # _validate_conn() starts the connection to an HTTPS proxy
238s # so we need to wrap errors with 'ProxyError' here too.
238s except (
238s OSError,
238s NewConnectionError,
238s TimeoutError,
238s BaseSSLError,
238s CertificateError,
238s SSLError,
238s ) as e:
238s new_e: Exception = e
238s if isinstance(e, (BaseSSLError, CertificateError)):
238s new_e = SSLError(e)
238s # If the connection didn't successfully connect to it's proxy
238s # then there
238s if isinstance(
238s new_e, (OSError, NewConnectionError, TimeoutError, SSLError)
238s ) and (conn and conn.proxy and not conn.has_connected_to_proxy):
238s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme)
238s raise new_e
238s
238s # conn.request() calls http.client.*.request, not the method in
238s # urllib3.request. It also calls makefile (recv) on the socket.
238s try:
238s conn.request(
238s method,
238s url,
238s body=body,
238s headers=headers,
238s chunked=chunked,
238s preload_content=preload_content,
238s decode_content=decode_content,
238s enforce_content_length=enforce_content_length,
238s )
238s
238s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is
238s # legitimately able to close the connection after sending a valid response.
238s # With this behaviour, the received response is still readable.
238s except BrokenPipeError:
238s pass
238s except OSError as e:
238s # MacOS/Linux
238s # EPROTOTYPE and ECONNRESET are needed on macOS
238s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/
238s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE.
238s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET:
238s raise
238s
238s # Reset the timeout for the recv() on the socket
238s read_timeout = timeout_obj.read_timeout
238s
238s if not conn.is_closed:
238s # In Python 3 socket.py will catch EAGAIN and return None when you
238s # try and read into the file pointer created by http.client, which
238s # instead raises a BadStatusLine exception. Instead of catching
238s # the exception and assuming all BadStatusLine exceptions are read
238s # timeouts, check for a zero timeout before making the request.
238s if read_timeout == 0:
238s raise ReadTimeoutError(
238s self, url, f"Read timed out. (read timeout={read_timeout})"
238s )
238s conn.timeout = read_timeout
238s
238s # Receive the response from the server
238s try:
238s response = conn.getresponse()
238s except (BaseSSLError, OSError) as e:
238s self._raise_timeout(err=e, url=url, timeout_value=read_timeout)
238s raise
238s
238s # Set properties that are used by the pooling layer.
238s response.retries = retries
238s response._connection = response_conn # type: ignore[attr-defined]
238s response._pool = self # type: ignore[attr-defined]
238s
238s log.debug(
238s '%s://%s:%s "%s %s %s" %s %s',
238s self.scheme,
238s self.host,
238s self.port,
238s method,
238s url,
238s > response.version_string,
238s response.status,
238s response.length_remaining,
238s )
238s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
238s
238s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError
238s ----------------------------- Captured stderr call -----------------------------
238s 127.0.0.1 - - [18/Jan/2025 02:31:45] "POST /post HTTP/1.1" 501 159
238s _______________________________ test_gzip[http] ________________________________
238s
238s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-1/test_gzip_http_0')
238s httpbin_both =
238s verify_pool_mgr =
238s
238s def test_gzip(tmpdir, httpbin_both, verify_pool_mgr):
238s """
238s Ensure that requests (actually urllib3) is able to automatically decompress
238s the response body
238s """
238s url = httpbin_both.url + "/gzip"
238s response = verify_pool_mgr.request("GET", url)
238s
238s with vcr.use_cassette(str(tmpdir.join("gzip.yaml"))):
238s > response = verify_pool_mgr.request("GET", url)
238s
238s tests/integration/test_urllib3.py:140:
238s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
238s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request
238s return self.request_encode_url(
238s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url
238s return self.urlopen(method, url, **extra_kw)
238s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen
238s response = conn.urlopen(method, u.request_uri, **kw)
238s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen
238s response = self._make_request(
238s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
238s
238s self =
238s conn =
238s method = 'GET', url = '/gzip', body = None, headers = {}
238s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None)
238s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None)
238s chunked = False, response_conn = None, preload_content = True
238s decode_content = True, enforce_content_length = True
238s
238s def _make_request(
238s self,
238s conn: BaseHTTPConnection,
238s method: str,
238s url: str,
238s body: _TYPE_BODY | None = None,
238s headers: typing.Mapping[str, str] | None = None,
238s retries: Retry | None = None,
238s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
238s chunked: bool = False,
238s response_conn: BaseHTTPConnection | None = None,
238s preload_content: bool = True,
238s decode_content: bool = True,
238s enforce_content_length: bool = True,
238s ) -> BaseHTTPResponse:
238s """
238s Perform a request on a given urllib connection object taken from our
238s pool.
238s
238s :param conn:
238s a connection from one of our connection pools
238s
238s :param method:
238s HTTP request method (such as GET, POST, PUT, etc.)
238s
238s :param url:
238s The URL to perform the request on.
238s
238s :param body:
238s Data to send in the request body, either :class:`str`, :class:`bytes`,
238s an iterable of :class:`str`/:class:`bytes`, or a file-like object.
238s
238s :param headers:
238s Dictionary of custom headers to send, such as User-Agent,
238s If-None-Match, etc. If None, pool headers are used. If provided,
238s these headers completely replace any pool-specific headers.
238s
238s :param retries:
238s Configure the number of retries to allow before raising a
238s :class:`~urllib3.exceptions.MaxRetryError` exception.
238s
238s Pass ``None`` to retry until you receive a response. Pass a
238s :class:`~urllib3.util.retry.Retry` object for fine-grained control
238s over different types of retries.
238s Pass an integer number to retry connection errors that many times,
238s but no other types of errors. Pass zero to never retry.
238s
238s If ``False``, then retries are disabled and any exception is raised
238s immediately. Also, instead of raising a MaxRetryError on redirects,
238s the redirect response will be returned.
238s
238s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
238s
238s :param timeout:
238s If specified, overrides the default timeout for this one
238s request. It may be a float (in seconds) or an instance of
238s :class:`urllib3.util.Timeout`.
238s
238s :param chunked:
238s If True, urllib3 will send the body using chunked transfer
238s encoding. Otherwise, urllib3 will send the body using the standard
238s content-length form. Defaults to False.
238s
238s :param response_conn:
238s Set this to ``None`` if you will handle releasing the connection or
238s set the connection to have the response release it.
238s
238s :param preload_content:
238s If True, the response's body will be preloaded during construction.
238s
238s :param decode_content:
238s If True, will attempt to decode the body based on the
238s 'content-encoding' header.
238s
238s :param enforce_content_length:
238s Enforce content length checking. Body returned by server must match
238s value of Content-Length header, if present. Otherwise, raise error.
238s """
238s self.num_requests += 1
238s
238s timeout_obj = self._get_timeout(timeout)
238s timeout_obj.start_connect()
238s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout)
238s
238s try:
238s # Trigger any extra validation we need to do.
238s try:
238s self._validate_conn(conn)
238s except (SocketTimeout, BaseSSLError) as e:
238s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout)
238s raise
238s
238s # _validate_conn() starts the connection to an HTTPS proxy
238s # so we need to wrap errors with 'ProxyError' here too.
238s except (
238s OSError,
238s NewConnectionError,
238s TimeoutError,
238s BaseSSLError,
238s CertificateError,
238s SSLError,
238s ) as e:
238s new_e: Exception = e
238s if isinstance(e, (BaseSSLError, CertificateError)):
238s new_e = SSLError(e)
238s # If the connection didn't successfully connect to it's proxy
238s # then there
238s if isinstance(
238s new_e, (OSError, NewConnectionError, TimeoutError, SSLError)
238s ) and (conn and conn.proxy and not conn.has_connected_to_proxy):
238s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme)
238s raise new_e
238s
238s # conn.request() calls http.client.*.request, not the method in
238s # urllib3.request. It also calls makefile (recv) on the socket.
238s try:
238s conn.request(
238s method,
238s url,
238s body=body,
238s headers=headers,
238s chunked=chunked,
238s preload_content=preload_content,
238s decode_content=decode_content,
238s enforce_content_length=enforce_content_length,
238s )
238s
238s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is
238s # legitimately able to close the connection after sending a valid response.
238s # With this behaviour, the received response is still readable.
238s except BrokenPipeError:
238s pass
238s except OSError as e:
238s # MacOS/Linux
238s # EPROTOTYPE and ECONNRESET are needed on macOS
238s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/
238s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE.
238s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET:
238s raise
238s
238s # Reset the timeout for the recv() on the socket
238s read_timeout = timeout_obj.read_timeout
238s
238s if not conn.is_closed:
238s # In Python 3 socket.py will catch EAGAIN and return None when you
238s # try and read into the file pointer created by http.client, which
238s # instead raises a BadStatusLine exception. Instead of catching
238s # the exception and assuming all BadStatusLine exceptions are read
238s # timeouts, check for a zero timeout before making the request.
238s if read_timeout == 0:
238s raise ReadTimeoutError(
238s self, url, f"Read timed out. (read timeout={read_timeout})"
238s )
238s conn.timeout = read_timeout
238s
238s # Receive the response from the server
238s try:
238s response = conn.getresponse()
238s except (BaseSSLError, OSError) as e:
238s self._raise_timeout(err=e, url=url, timeout_value=read_timeout)
238s raise
238s
238s # Set properties that are used by the pooling layer.
238s response.retries = retries
238s response._connection = response_conn # type: ignore[attr-defined]
238s response._pool = self # type: ignore[attr-defined]
238s
238s log.debug(
238s '%s://%s:%s "%s %s %s" %s %s',
238s self.scheme,
238s self.host,
238s self.port,
238s method,
238s url,
238s > response.version_string,
238s response.status,
238s response.length_remaining,
238s )
238s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
238s
238s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError
238s ----------------------------- Captured stderr call -----------------------------
238s 127.0.0.1 - - [18/Jan/2025 02:31:45] "GET /gzip HTTP/1.1" 200 165
238s 127.0.0.1 - - [18/Jan/2025 02:31:45] "GET /gzip HTTP/1.1" 200 165
238s ___________________________ test_status_code[https] ____________________________
238s
238s httpbin_both =
238s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-1/test_status_code_https_0')
238s verify_pool_mgr =
238s
238s def test_status_code(httpbin_both, tmpdir, verify_pool_mgr):
238s """Ensure that we can read the status code"""
238s url = httpbin_both.url
238s with vcr.use_cassette(str(tmpdir.join("atts.yaml"))):
238s > status_code = verify_pool_mgr.request("GET", url).status
238s
238s tests/integration/test_urllib3.py:34:
238s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
238s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request
238s return self.request_encode_url(
238s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url
238s return self.urlopen(method, url, **extra_kw)
238s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen
238s response = conn.urlopen(method, u.request_uri, **kw)
238s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen
238s response = self._make_request(
238s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
238s
238s self =
238s conn =
238s method = 'GET', url = '/', body = None, headers = {}
238s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None)
238s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None)
238s chunked = False, response_conn = None, preload_content = True
238s decode_content = True, enforce_content_length = True
238s
238s def _make_request(
238s self,
238s conn: BaseHTTPConnection,
238s method: str,
238s url: str,
238s body: _TYPE_BODY | None = None,
238s headers: typing.Mapping[str, str] | None = None,
238s retries: Retry | None = None,
238s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
238s chunked: bool = False,
238s response_conn: BaseHTTPConnection | None = None,
238s preload_content: bool = True,
238s decode_content: bool = True,
238s enforce_content_length: bool = True,
238s ) -> BaseHTTPResponse:
238s """
238s Perform a request on a given urllib connection object taken from our
238s pool.
238s
238s :param conn:
238s a connection from one of our connection pools
238s
238s :param method:
238s HTTP request method (such as GET, POST, PUT, etc.)
238s
238s :param url:
238s The URL to perform the request on.
238s
238s :param body:
238s Data to send in the request body, either :class:`str`, :class:`bytes`,
238s an iterable of :class:`str`/:class:`bytes`, or a file-like object.
238s
238s :param headers:
238s Dictionary of custom headers to send, such as User-Agent,
238s If-None-Match, etc. If None, pool headers are used. If provided,
238s these headers completely replace any pool-specific headers.
238s
238s :param retries:
238s Configure the number of retries to allow before raising a
238s :class:`~urllib3.exceptions.MaxRetryError` exception.
238s
238s Pass ``None`` to retry until you receive a response. Pass a
238s :class:`~urllib3.util.retry.Retry` object for fine-grained control
238s over different types of retries.
238s Pass an integer number to retry connection errors that many times,
238s but no other types of errors. Pass zero to never retry.
238s
238s If ``False``, then retries are disabled and any exception is raised
238s immediately. Also, instead of raising a MaxRetryError on redirects,
238s the redirect response will be returned.
238s
238s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
238s
238s :param timeout:
238s If specified, overrides the default timeout for this one
238s request. It may be a float (in seconds) or an instance of
238s :class:`urllib3.util.Timeout`.
238s
238s :param chunked:
238s If True, urllib3 will send the body using chunked transfer
238s encoding. Otherwise, urllib3 will send the body using the standard
238s content-length form. Defaults to False.
238s
238s :param response_conn:
238s Set this to ``None`` if you will handle releasing the connection or
238s set the connection to have the response release it.
238s
238s :param preload_content:
238s If True, the response's body will be preloaded during construction.
238s
238s :param decode_content:
238s If True, will attempt to decode the body based on the
238s 'content-encoding' header.
238s
238s :param enforce_content_length:
238s Enforce content length checking. Body returned by server must match
238s value of Content-Length header, if present. Otherwise, raise error.
238s """
238s self.num_requests += 1
238s
238s timeout_obj = self._get_timeout(timeout)
238s timeout_obj.start_connect()
238s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout)
238s
238s try:
238s # Trigger any extra validation we need to do.
238s try:
238s self._validate_conn(conn)
238s except (SocketTimeout, BaseSSLError) as e:
238s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout)
238s raise
238s
238s # _validate_conn() starts the connection to an HTTPS proxy
238s # so we need to wrap errors with 'ProxyError' here too.
238s except (
238s OSError,
238s NewConnectionError,
238s TimeoutError,
238s BaseSSLError,
238s CertificateError,
238s SSLError,
238s ) as e:
238s new_e: Exception = e
238s if isinstance(e, (BaseSSLError, CertificateError)):
238s new_e = SSLError(e)
238s # If the connection didn't successfully connect to it's proxy
238s # then there
238s if isinstance(
238s new_e, (OSError, NewConnectionError, TimeoutError, SSLError)
238s ) and (conn and conn.proxy and not conn.has_connected_to_proxy):
238s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme)
238s raise new_e
238s
238s # conn.request() calls http.client.*.request, not the method in
238s # urllib3.request. It also calls makefile (recv) on the socket.
238s try:
238s conn.request(
238s method,
238s url,
238s body=body,
238s headers=headers,
238s chunked=chunked,
238s preload_content=preload_content,
238s decode_content=decode_content,
238s enforce_content_length=enforce_content_length,
238s )
238s
238s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is
238s # legitimately able to close the connection after sending a valid response.
238s # With this behaviour, the received response is still readable.
238s except BrokenPipeError:
238s pass
238s except OSError as e:
238s # MacOS/Linux
238s # EPROTOTYPE and ECONNRESET are needed on macOS
238s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/
238s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE.
238s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET:
238s raise
238s
238s # Reset the timeout for the recv() on the socket
238s read_timeout = timeout_obj.read_timeout
238s
238s if not conn.is_closed:
238s # In Python 3 socket.py will catch EAGAIN and return None when you
238s # try and read into the file pointer created by http.client, which
238s # instead raises a BadStatusLine exception. Instead of catching
238s # the exception and assuming all BadStatusLine exceptions are read
238s # timeouts, check for a zero timeout before making the request.
238s if read_timeout == 0:
238s raise ReadTimeoutError(
238s self, url, f"Read timed out. (read timeout={read_timeout})"
238s )
238s conn.timeout = read_timeout
238s
238s # Receive the response from the server
238s try:
238s response = conn.getresponse()
238s except (BaseSSLError, OSError) as e:
238s self._raise_timeout(err=e, url=url, timeout_value=read_timeout)
238s raise
238s
238s # Set properties that are used by the pooling layer.
238s response.retries = retries
238s response._connection = response_conn # type: ignore[attr-defined]
238s response._pool = self # type: ignore[attr-defined]
238s
238s log.debug(
238s '%s://%s:%s "%s %s %s" %s %s',
238s self.scheme,
238s self.host,
238s self.port,
238s method,
238s url,
238s > response.version_string,
238s response.status,
238s response.length_remaining,
238s )
238s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
238s
238s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError
238s ----------------------------- Captured stderr call -----------------------------
238s 127.0.0.1 - - [18/Jan/2025 02:31:45] "GET / HTTP/1.1" 200 9358
238s _____________________________ test_headers[https] ______________________________
238s
238s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-1/test_headers_https_0')
238s httpbin_both =
238s verify_pool_mgr =
238s
238s def test_headers(tmpdir, httpbin_both, verify_pool_mgr):
238s """Ensure that we can read the headers back"""
238s url = httpbin_both.url
238s with vcr.use_cassette(str(tmpdir.join("headers.yaml"))):
238s > headers = verify_pool_mgr.request("GET", url).headers
238s
238s tests/integration/test_urllib3.py:44:
238s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
238s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request
238s return self.request_encode_url(
238s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url
238s return self.urlopen(method, url, **extra_kw)
238s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen
238s response = conn.urlopen(method, u.request_uri, **kw)
238s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen
238s response = self._make_request(
238s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
238s
238s self =
238s conn =
238s method = 'GET', url = '/', body = None, headers = {}
238s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None)
238s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None)
238s chunked = False, response_conn = None, preload_content = True
238s decode_content = True, enforce_content_length = True
238s
238s def _make_request(
238s self,
238s conn: BaseHTTPConnection,
238s method: str,
238s url: str,
238s body: _TYPE_BODY | None = None,
238s headers: typing.Mapping[str, str] | None = None,
238s retries: Retry | None = None,
238s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
238s chunked: bool = False,
238s response_conn: BaseHTTPConnection | None = None,
238s preload_content: bool = True,
238s decode_content: bool = True,
238s enforce_content_length: bool = True,
238s ) -> BaseHTTPResponse:
238s """
238s Perform a request on a given urllib connection object taken from our
238s pool.
238s
238s :param conn:
238s a connection from one of our connection pools
238s
238s :param method:
238s HTTP request method (such as GET, POST, PUT, etc.)
238s
238s :param url:
238s The URL to perform the request on.
238s
238s :param body:
238s Data to send in the request body, either :class:`str`, :class:`bytes`,
238s an iterable of :class:`str`/:class:`bytes`, or a file-like object.
238s
238s :param headers:
238s Dictionary of custom headers to send, such as User-Agent,
238s If-None-Match, etc. If None, pool headers are used. If provided,
238s these headers completely replace any pool-specific headers.
238s
238s :param retries:
238s Configure the number of retries to allow before raising a
238s :class:`~urllib3.exceptions.MaxRetryError` exception.
238s
238s Pass ``None`` to retry until you receive a response. Pass a
238s :class:`~urllib3.util.retry.Retry` object for fine-grained control
238s over different types of retries.
238s Pass an integer number to retry connection errors that many times,
238s but no other types of errors. Pass zero to never retry.
238s
238s If ``False``, then retries are disabled and any exception is raised
238s immediately. Also, instead of raising a MaxRetryError on redirects,
238s the redirect response will be returned.
238s
238s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
238s
238s :param timeout:
238s If specified, overrides the default timeout for this one
238s request. It may be a float (in seconds) or an instance of
238s :class:`urllib3.util.Timeout`.
238s
238s :param chunked:
238s If True, urllib3 will send the body using chunked transfer
238s encoding. Otherwise, urllib3 will send the body using the standard
238s content-length form. Defaults to False.
238s
238s :param response_conn:
238s Set this to ``None`` if you will handle releasing the connection or
238s set the connection to have the response release it.
238s
238s :param preload_content:
238s If True, the response's body will be preloaded during construction.
238s
238s :param decode_content:
238s If True, will attempt to decode the body based on the
238s 'content-encoding' header.
238s
238s :param enforce_content_length:
238s Enforce content length checking. Body returned by server must match
238s value of Content-Length header, if present. Otherwise, raise error.
238s """
238s self.num_requests += 1
238s
238s timeout_obj = self._get_timeout(timeout)
238s timeout_obj.start_connect()
238s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout)
238s
238s try:
238s # Trigger any extra validation we need to do.
238s try:
238s self._validate_conn(conn)
238s except (SocketTimeout, BaseSSLError) as e:
238s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout)
238s raise
238s
238s # _validate_conn() starts the connection to an HTTPS proxy
238s # so we need to wrap errors with 'ProxyError' here too.
238s except (
238s OSError,
238s NewConnectionError,
238s TimeoutError,
238s BaseSSLError,
238s CertificateError,
238s SSLError,
238s ) as e:
238s new_e: Exception = e
238s if isinstance(e, (BaseSSLError, CertificateError)):
238s new_e = SSLError(e)
238s # If the connection didn't successfully connect to it's proxy
238s # then there
238s if isinstance(
238s new_e, (OSError, NewConnectionError, TimeoutError, SSLError)
238s ) and (conn and conn.proxy and not conn.has_connected_to_proxy):
238s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme)
238s raise new_e
238s
238s # conn.request() calls http.client.*.request, not the method in
238s # urllib3.request. It also calls makefile (recv) on the socket.
238s try:
238s conn.request(
238s method,
238s url,
238s body=body,
238s headers=headers,
238s chunked=chunked,
238s preload_content=preload_content,
238s decode_content=decode_content,
238s enforce_content_length=enforce_content_length,
238s )
238s
238s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is
238s # legitimately able to close the connection after sending a valid response.
238s # With this behaviour, the received response is still readable.
238s except BrokenPipeError:
238s pass
238s except OSError as e:
238s # MacOS/Linux
238s # EPROTOTYPE and ECONNRESET are needed on macOS
238s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/
238s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE.
238s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET:
238s raise
238s
238s # Reset the timeout for the recv() on the socket
238s read_timeout = timeout_obj.read_timeout
238s
238s if not conn.is_closed:
238s # In Python 3 socket.py will catch EAGAIN and return None when you
238s # try and read into the file pointer created by http.client, which
238s # instead raises a BadStatusLine exception. Instead of catching
238s # the exception and assuming all BadStatusLine exceptions are read
238s # timeouts, check for a zero timeout before making the request.
238s if read_timeout == 0:
238s raise ReadTimeoutError(
238s self, url, f"Read timed out. (read timeout={read_timeout})"
238s )
238s conn.timeout = read_timeout
238s
238s # Receive the response from the server
238s try:
238s response = conn.getresponse()
238s except (BaseSSLError, OSError) as e:
238s self._raise_timeout(err=e, url=url, timeout_value=read_timeout)
238s raise
238s
238s # Set properties that are used by the pooling layer.
238s response.retries = retries 238s response._connection = response_conn # type: ignore[attr-defined] 238s response._pool = self # type: ignore[attr-defined] 238s 238s log.debug( 238s '%s://%s:%s "%s %s %s" %s %s', 238s self.scheme, 238s self.host, 238s self.port, 238s method, 238s url, 238s > response.version_string, 238s response.status, 238s response.length_remaining, 238s ) 238s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 238s 238s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 238s ----------------------------- Captured stderr call ----------------------------- 238s 127.0.0.1 - - [18/Jan/2025 02:31:46] "GET / HTTP/1.1" 200 9358 238s _______________________________ test_body[https] _______________________________ 238s 238s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-1/test_body_https_0') 238s httpbin_both = 238s verify_pool_mgr = 238s 238s def test_body(tmpdir, httpbin_both, verify_pool_mgr): 238s """Ensure the responses are all identical enough""" 238s url = httpbin_both.url + "/bytes/1024" 238s with vcr.use_cassette(str(tmpdir.join("body.yaml"))): 238s > content = verify_pool_mgr.request("GET", url).data 238s 238s tests/integration/test_urllib3.py:55: 238s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 238s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 238s return self.request_encode_url( 238s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 238s return self.urlopen(method, url, **extra_kw) 238s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 238s response = conn.urlopen(method, u.request_uri, **kw) 238s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 238s response = self._make_request( 238s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 238s 238s self = 238s conn = 238s method = 'GET', url = '/bytes/1024', body = None, 
headers = {} 238s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 238s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 238s chunked = False, response_conn = None, preload_content = True 238s decode_content = True, enforce_content_length = True 238s 238s log.debug( 238s '%s://%s:%s "%s %s %s" %s %s', 238s self.scheme, 238s self.host, 238s self.port, 238s method, 238s url, 238s > response.version_string, 238s response.status, 238s response.length_remaining, 238s ) 238s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 238s 238s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 238s ----------------------------- Captured stderr call ----------------------------- 238s 127.0.0.1 - - [18/Jan/2025 02:31:46] "GET /bytes/1024 HTTP/1.1" 200 1024 238s _______________________________ test_auth[https] _______________________________ 238s 238s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-1/test_auth_https_0') 238s httpbin_both = 238s verify_pool_mgr = 238s 238s def test_auth(tmpdir, httpbin_both, verify_pool_mgr): 238s """Ensure that we can handle basic auth""" 238s auth = ("user", "passwd") 238s headers = urllib3.util.make_headers(basic_auth="{}:{}".format(*auth)) 238s url = httpbin_both.url + "/basic-auth/user/passwd" 238s with vcr.use_cassette(str(tmpdir.join("auth.yaml"))): 238s > one = verify_pool_mgr.request("GET", url, headers=headers) 238s 238s tests/integration/test_urllib3.py:67: 238s _ _ _ _ _ _ _ _ _ _ _ _ _
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 238s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 238s return self.request_encode_url( 238s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 238s return self.urlopen(method, url, **extra_kw) 238s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 238s response = conn.urlopen(method, u.request_uri, **kw) 238s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 238s response = self._make_request( 238s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 238s 238s self = 238s conn = 238s method = 'GET', url = '/basic-auth/user/passwd', body = None 238s headers = {'authorization': 'Basic dXNlcjpwYXNzd2Q='} 238s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 238s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 238s chunked = False, response_conn = None, preload_content = True 238s decode_content = True, enforce_content_length = True 238s 238s log.debug( 238s '%s://%s:%s "%s %s %s" %s %s', 238s self.scheme, 238s self.host, 238s self.port, 238s method, 238s url, 238s > response.version_string, 238s response.status, 238s response.length_remaining, 238s ) 238s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 238s 238s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 238s ----------------------------- Captured stderr call ----------------------------- 238s 127.0.0.1 - - [18/Jan/2025 02:31:46] "GET /basic-auth/user/passwd HTTP/1.1" 200 46 238s ___________________________ test_auth_failed[https] ____________________________ 238s 238s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-1/test_auth_failed_https_0') 238s httpbin_both = 238s verify_pool_mgr = 238s 238s def test_auth_failed(tmpdir, httpbin_both, verify_pool_mgr): 238s """Ensure that we can save failed auth statuses""" 238s auth = ("user", "wrongwrongwrong") 238s headers = urllib3.util.make_headers(basic_auth="{}:{}".format(*auth)) 238s url = httpbin_both.url + "/basic-auth/user/passwd" 238s with vcr.use_cassette(str(tmpdir.join("auth-failed.yaml"))) as cass: 238s # Ensure that this is empty to begin with 238s assert_cassette_empty(cass) 238s > one = verify_pool_mgr.request("GET", url, headers=headers) 238s 238s tests/integration/test_urllib3.py:83: 238s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 238s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 238s return self.request_encode_url( 238s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 238s return self.urlopen(method, url, **extra_kw) 238s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 238s response = conn.urlopen(method, u.request_uri, **kw) 238s
/usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 238s response = self._make_request( 238s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 238s 238s self = 238s conn = 238s method = 'GET', url = '/basic-auth/user/passwd', body = None 238s headers = {'authorization': 'Basic dXNlcjp3cm9uZ3dyb25nd3Jvbmc='} 238s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 238s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 238s chunked = False, response_conn = None, preload_content = True 238s decode_content = True, enforce_content_length = True 238s 238s log.debug( 238s '%s://%s:%s "%s %s %s" %s %s', 238s self.scheme, 238s self.host, 238s self.port, 238s method, 238s url, 238s > response.version_string, 238s response.status, 238s response.length_remaining, 238s ) 238s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 238s 238s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 238s ----------------------------- Captured stderr call ----------------------------- 238s 127.0.0.1 - - [18/Jan/2025 02:31:46] "GET /basic-auth/user/passwd HTTP/1.1" 401 0 238s _______________________________ test_post[https] _______________________________ 238s 238s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-1/test_post_https_0') 238s httpbin_both = 238s verify_pool_mgr = 238s 238s def test_post(tmpdir, httpbin_both, verify_pool_mgr): 238s """Ensure that we can post and cache the results""" 238s data = {"key1": "value1", "key2": "value2"} 238s url = httpbin_both.url + "/post" 238s with vcr.use_cassette(str(tmpdir.join("verify_pool_mgr.yaml"))): 238s > req1 = verify_pool_mgr.request("POST", url, data).data 238s 238s tests/integration/test_urllib3.py:94: 238s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 238s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:143: in request 238s return self.request_encode_body( 238s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:278: in request_encode_body 238s return self.urlopen(method, url, **extra_kw) 238s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 238s response = conn.urlopen(method, u.request_uri, **kw) 238s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 238s response = self._make_request( 238s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 238s
238s self = 238s conn = 238s method = 'POST', url = '/post', body = {'key1': 'value1', 'key2': 'value2'} 238s headers = HTTPHeaderDict({}) 238s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 238s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 238s chunked = False, response_conn = None, preload_content = True 238s decode_content = True, enforce_content_length = True 238s 238s def _make_request( 238s self, 238s conn: BaseHTTPConnection, 238s method: str, 238s url: str, 238s body: _TYPE_BODY | None = None, 238s headers: typing.Mapping[str, str] | None = None, 238s retries: Retry | None = None, 238s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 238s chunked: bool = False, 238s response_conn: BaseHTTPConnection | None = None, 238s preload_content: bool = True, 238s decode_content: bool = True, 238s enforce_content_length: bool = True, 238s ) -> BaseHTTPResponse: 238s """ 238s Perform a request on a given urllib connection object taken from our 238s pool. 238s 238s :param conn: 238s a connection from one of our connection pools 238s 238s :param method: 238s HTTP request method (such as GET, POST, PUT, etc.) 238s 238s :param url: 238s The URL to perform the request on. 238s 238s :param body: 238s Data to send in the request body, either :class:`str`, :class:`bytes`, 238s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 238s 238s :param headers: 238s Dictionary of custom headers to send, such as User-Agent, 238s If-None-Match, etc. If None, pool headers are used. If provided, 238s these headers completely replace any pool-specific headers. 238s 238s :param retries: 238s Configure the number of retries to allow before raising a 238s :class:`~urllib3.exceptions.MaxRetryError` exception. 238s 238s Pass ``None`` to retry until you receive a response. Pass a 238s :class:`~urllib3.util.retry.Retry` object for fine-grained control 238s over different types of retries. 
238s Pass an integer number to retry connection errors that many times, 238s but no other types of errors. Pass zero to never retry. 238s 238s If ``False``, then retries are disabled and any exception is raised 238s immediately. Also, instead of raising a MaxRetryError on redirects, 238s the redirect response will be returned. 238s 238s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 238s 238s :param timeout: 238s If specified, overrides the default timeout for this one 238s request. It may be a float (in seconds) or an instance of 238s :class:`urllib3.util.Timeout`. 238s 238s :param chunked: 238s If True, urllib3 will send the body using chunked transfer 238s encoding. Otherwise, urllib3 will send the body using the standard 238s content-length form. Defaults to False. 238s 238s :param response_conn: 238s Set this to ``None`` if you will handle releasing the connection or 238s set the connection to have the response release it. 238s 238s :param preload_content: 238s If True, the response's body will be preloaded during construction. 238s 238s :param decode_content: 238s If True, will attempt to decode the body based on the 238s 'content-encoding' header. 238s 238s :param enforce_content_length: 238s Enforce content length checking. Body returned by server must match 238s value of Content-Length header, if present. Otherwise, raise error. 238s """ 238s self.num_requests += 1 238s 238s timeout_obj = self._get_timeout(timeout) 238s timeout_obj.start_connect() 238s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 238s 238s try: 238s # Trigger any extra validation we need to do. 238s try: 238s self._validate_conn(conn) 238s except (SocketTimeout, BaseSSLError) as e: 238s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 238s raise 238s 238s # _validate_conn() starts the connection to an HTTPS proxy 238s # so we need to wrap errors with 'ProxyError' here too. 
238s except ( 238s OSError, 238s NewConnectionError, 238s TimeoutError, 238s BaseSSLError, 238s CertificateError, 238s SSLError, 238s ) as e: 238s new_e: Exception = e 238s if isinstance(e, (BaseSSLError, CertificateError)): 238s new_e = SSLError(e) 238s # If the connection didn't successfully connect to it's proxy 238s # then there 238s if isinstance( 238s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 238s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 238s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 238s raise new_e 238s 238s # conn.request() calls http.client.*.request, not the method in 238s # urllib3.request. It also calls makefile (recv) on the socket. 238s try: 238s conn.request( 238s method, 238s url, 238s body=body, 238s headers=headers, 238s chunked=chunked, 238s preload_content=preload_content, 238s decode_content=decode_content, 238s enforce_content_length=enforce_content_length, 238s ) 238s 238s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 238s # legitimately able to close the connection after sending a valid response. 238s # With this behaviour, the received response is still readable. 238s except BrokenPipeError: 238s pass 238s except OSError as e: 238s # MacOS/Linux 238s # EPROTOTYPE and ECONNRESET are needed on macOS 238s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 238s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 238s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 238s raise 238s 238s # Reset the timeout for the recv() on the socket 238s read_timeout = timeout_obj.read_timeout 238s 238s if not conn.is_closed: 238s # In Python 3 socket.py will catch EAGAIN and return None when you 238s # try and read into the file pointer created by http.client, which 238s # instead raises a BadStatusLine exception. 
Instead of catching 238s # the exception and assuming all BadStatusLine exceptions are read 238s # timeouts, check for a zero timeout before making the request. 238s if read_timeout == 0: 238s raise ReadTimeoutError( 238s self, url, f"Read timed out. (read timeout={read_timeout})" 238s ) 238s conn.timeout = read_timeout 238s 238s # Receive the response from the server 238s try: 238s response = conn.getresponse() 238s except (BaseSSLError, OSError) as e: 238s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 238s raise 238s 238s # Set properties that are used by the pooling layer. 238s response.retries = retries 238s response._connection = response_conn # type: ignore[attr-defined] 238s response._pool = self # type: ignore[attr-defined] 238s 238s log.debug( 238s '%s://%s:%s "%s %s %s" %s %s', 238s self.scheme, 238s self.host, 238s self.port, 238s method, 238s url, 238s > response.version_string, 238s response.status, 238s response.length_remaining, 238s ) 238s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 238s 238s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 238s ----------------------------- Captured stderr call ----------------------------- 238s 127.0.0.1 - - [18/Jan/2025 02:31:46] "POST /post HTTP/1.1" 501 159 238s _______________________________ test_gzip[https] _______________________________ 238s 238s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-1/test_gzip_https_0') 238s httpbin_both = 238s verify_pool_mgr = 238s 238s def test_gzip(tmpdir, httpbin_both, verify_pool_mgr): 238s """ 238s Ensure that requests (actually urllib3) is able to automatically decompress 238s the response body 238s """ 238s url = httpbin_both.url + "/gzip" 238s response = verify_pool_mgr.request("GET", url) 238s 238s with vcr.use_cassette(str(tmpdir.join("gzip.yaml"))): 238s > response = verify_pool_mgr.request("GET", url) 238s 238s tests/integration/test_urllib3.py:140: 238s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 238s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 238s return self.request_encode_url( 238s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 238s return self.urlopen(method, url, **extra_kw) 238s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 238s response = conn.urlopen(method, u.request_uri, **kw) 238s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 238s response = self._make_request( 238s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 238s 238s self = 238s conn = 238s method = 'GET', url = '/gzip', body = None, headers = {} 238s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 238s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 238s chunked = False, response_conn = None, preload_content = True 238s decode_content = True, enforce_content_length = True
238s response.retries = retries 238s response._connection = response_conn # type: ignore[attr-defined] 238s response._pool = self # type: ignore[attr-defined] 238s 238s log.debug( 238s '%s://%s:%s "%s %s %s" %s %s', 238s self.scheme, 238s self.host, 238s self.port, 238s method, 238s url, 238s > response.version_string, 238s response.status, 238s response.length_remaining, 238s ) 238s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 238s 238s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 238s ----------------------------- Captured stderr call ----------------------------- 238s 127.0.0.1 - - [18/Jan/2025 02:31:46] "GET /gzip HTTP/1.1" 200 165 238s 127.0.0.1 - - [18/Jan/2025 02:31:46] "GET /gzip HTTP/1.1" 200 165 238s ________________________________ test_use_proxy ________________________________ 238s 238s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-1/test_use_proxy0') 238s httpbin = 238s proxy_server = 'http://0.0.0.0:33259' 238s 238s def test_use_proxy(tmpdir, httpbin, proxy_server): 238s """Ensure that it works with a proxy.""" 238s with vcr.use_cassette(str(tmpdir.join("proxy.yaml"))): 238s > response = requests.get(httpbin.url, proxies={"http": proxy_server}) 238s 238s tests/integration/test_proxy.py:53: 238s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 238s /usr/lib/python3/dist-packages/requests/api.py:73: in get 238s return request("get", url, params=params, **kwargs) 238s /usr/lib/python3/dist-packages/requests/api.py:59: in request 238s return session.request(method=method, url=url, **kwargs) 238s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 238s resp = self.send(prep, **send_kwargs) 238s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 238s r = adapter.send(request, **kwargs) 238s /usr/lib/python3/dist-packages/requests/adapters.py:667: in send 238s resp = conn.urlopen( 238s 
/usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 238s response = self._make_request( 238s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 238s 238s self = 238s conn = 238s method = 'GET', url = 'http://127.0.0.1:44235/', body = None 238s headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate, br', 'Accept': '*/*', 'Connection': 'keep-alive'} 238s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 238s timeout = Timeout(connect=None, read=None, total=None), chunked = False 238s response_conn = 238s preload_content = False, decode_content = False, enforce_content_length = True
238s response.retries = retries 238s response._connection = response_conn # type: ignore[attr-defined] 238s response._pool = self # type: ignore[attr-defined] 238s 238s log.debug( 238s '%s://%s:%s "%s %s %s" %s %s', 238s self.scheme, 238s self.host, 238s self.port, 238s method, 238s url, 238s > response.version_string, 238s response.status, 238s response.length_remaining, 238s ) 238s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 238s 238s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 238s ----------------------------- Captured stderr call ----------------------------- 238s 127.0.0.1 - - [18/Jan/2025 02:31:46] "GET / HTTP/1.1" 200 9358 238s 127.0.0.1 - - [18/Jan/2025 02:31:46] "GET http://127.0.0.1:44235/ HTTP/1.1" 200 - 238s ______________________________ test_cross_scheme _______________________________ 238s 238s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-1/test_cross_scheme2') 238s httpbin = 238s httpbin_secure = 238s verify_pool_mgr = 238s 238s def test_cross_scheme(tmpdir, httpbin, httpbin_secure, verify_pool_mgr): 238s """Ensure that requests between schemes are treated separately""" 238s # First fetch a url under http, and then again under https and then 238s # ensure that we haven't served anything out of cache, and we have two 238s # requests / response pairs in the cassette 238s with vcr.use_cassette(str(tmpdir.join("cross_scheme.yaml"))) as cass: 238s > verify_pool_mgr.request("GET", httpbin_secure.url) 238s 238s tests/integration/test_urllib3.py:125: 238s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 238s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 238s return self.request_encode_url( 238s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 238s return self.urlopen(method, url, **extra_kw) 238s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 238s response = 
conn.urlopen(method, u.request_uri, **kw) 238s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 238s response = self._make_request( 238s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 238s 238s self = 238s conn = 238s method = 'GET', url = '/', body = None, headers = {} 238s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 238s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 238s chunked = False, response_conn = None, preload_content = True 238s decode_content = True, enforce_content_length = True
238s response.retries = retries 238s response._connection = response_conn # type: ignore[attr-defined] 238s response._pool = self # type: ignore[attr-defined] 238s 238s log.debug( 238s '%s://%s:%s "%s %s %s" %s %s', 238s self.scheme, 238s self.host, 238s self.port, 238s method, 238s url, 238s > response.version_string, 238s response.status, 238s response.length_remaining, 238s ) 238s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 238s 238s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 238s ----------------------------- Captured stderr call ----------------------------- 238s 127.0.0.1 - - [18/Jan/2025 02:31:46] "GET / HTTP/1.1" 200 9358 238s ___________________ test_https_with_cert_validation_disabled ___________________ 238s 238s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-1/test_https_with_cert_validatio0') 238s httpbin_secure = 238s pool_mgr = 238s 238s def test_https_with_cert_validation_disabled(tmpdir, httpbin_secure, pool_mgr): 238s with vcr.use_cassette(str(tmpdir.join("cert_validation_disabled.yaml"))): 238s > pool_mgr.request("GET", httpbin_secure.url) 238s 238s tests/integration/test_urllib3.py:149: 238s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 238s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 238s return self.request_encode_url( 238s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 238s return self.urlopen(method, url, **extra_kw) 238s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 238s response = conn.urlopen(method, u.request_uri, **kw) 238s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 238s response = self._make_request( 238s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 238s 238s self = 238s conn = 238s method = 'GET', url = '/', body = None, headers = {} 238s retries = Retry(total=3, connect=None, 
read=None, redirect=None, status=None) 238s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 238s chunked = False, response_conn = None, preload_content = True 238s decode_content = True, enforce_content_length = True 238s 238s def _make_request( 238s self, 238s conn: BaseHTTPConnection, 238s method: str, 238s url: str, 238s body: _TYPE_BODY | None = None, 238s headers: typing.Mapping[str, str] | None = None, 238s retries: Retry | None = None, 238s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 238s chunked: bool = False, 238s response_conn: BaseHTTPConnection | None = None, 238s preload_content: bool = True, 238s decode_content: bool = True, 238s enforce_content_length: bool = True, 238s ) -> BaseHTTPResponse: 238s """ 238s Perform a request on a given urllib connection object taken from our 238s pool. 238s 238s :param conn: 238s a connection from one of our connection pools 238s 238s :param method: 238s HTTP request method (such as GET, POST, PUT, etc.) 238s 238s :param url: 238s The URL to perform the request on. 238s 238s :param body: 238s Data to send in the request body, either :class:`str`, :class:`bytes`, 238s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 238s 238s :param headers: 238s Dictionary of custom headers to send, such as User-Agent, 238s If-None-Match, etc. If None, pool headers are used. If provided, 238s these headers completely replace any pool-specific headers. 238s 238s :param retries: 238s Configure the number of retries to allow before raising a 238s :class:`~urllib3.exceptions.MaxRetryError` exception. 238s 238s Pass ``None`` to retry until you receive a response. Pass a 238s :class:`~urllib3.util.retry.Retry` object for fine-grained control 238s over different types of retries. 238s Pass an integer number to retry connection errors that many times, 238s but no other types of errors. Pass zero to never retry. 
238s 238s If ``False``, then retries are disabled and any exception is raised 238s immediately. Also, instead of raising a MaxRetryError on redirects, 238s the redirect response will be returned. 238s 238s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 238s 238s :param timeout: 238s If specified, overrides the default timeout for this one 238s request. It may be a float (in seconds) or an instance of 238s :class:`urllib3.util.Timeout`. 238s 238s :param chunked: 238s If True, urllib3 will send the body using chunked transfer 238s encoding. Otherwise, urllib3 will send the body using the standard 238s content-length form. Defaults to False. 238s 238s :param response_conn: 238s Set this to ``None`` if you will handle releasing the connection or 238s set the connection to have the response release it. 238s 238s :param preload_content: 238s If True, the response's body will be preloaded during construction. 238s 238s :param decode_content: 238s If True, will attempt to decode the body based on the 238s 'content-encoding' header. 238s 238s :param enforce_content_length: 238s Enforce content length checking. Body returned by server must match 238s value of Content-Length header, if present. Otherwise, raise error. 238s """ 238s self.num_requests += 1 238s 238s timeout_obj = self._get_timeout(timeout) 238s timeout_obj.start_connect() 238s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 238s 238s try: 238s # Trigger any extra validation we need to do. 238s try: 238s self._validate_conn(conn) 238s except (SocketTimeout, BaseSSLError) as e: 238s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 238s raise 238s 238s # _validate_conn() starts the connection to an HTTPS proxy 238s # so we need to wrap errors with 'ProxyError' here too. 
238s except ( 238s OSError, 238s NewConnectionError, 238s TimeoutError, 238s BaseSSLError, 238s CertificateError, 238s SSLError, 238s ) as e: 238s new_e: Exception = e 238s if isinstance(e, (BaseSSLError, CertificateError)): 238s new_e = SSLError(e) 238s # If the connection didn't successfully connect to it's proxy 238s # then there 238s if isinstance( 238s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 238s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 238s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 238s raise new_e 238s 238s # conn.request() calls http.client.*.request, not the method in 238s # urllib3.request. It also calls makefile (recv) on the socket. 238s try: 238s conn.request( 238s method, 238s url, 238s body=body, 238s headers=headers, 238s chunked=chunked, 238s preload_content=preload_content, 238s decode_content=decode_content, 238s enforce_content_length=enforce_content_length, 238s ) 238s 238s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 238s # legitimately able to close the connection after sending a valid response. 238s # With this behaviour, the received response is still readable. 238s except BrokenPipeError: 238s pass 238s except OSError as e: 238s # MacOS/Linux 238s # EPROTOTYPE and ECONNRESET are needed on macOS 238s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 238s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 238s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 238s raise 238s 238s # Reset the timeout for the recv() on the socket 238s read_timeout = timeout_obj.read_timeout 238s 238s if not conn.is_closed: 238s # In Python 3 socket.py will catch EAGAIN and return None when you 238s # try and read into the file pointer created by http.client, which 238s # instead raises a BadStatusLine exception. 
Instead of catching 238s # the exception and assuming all BadStatusLine exceptions are read 238s # timeouts, check for a zero timeout before making the request. 238s if read_timeout == 0: 238s raise ReadTimeoutError( 238s self, url, f"Read timed out. (read timeout={read_timeout})" 238s ) 238s conn.timeout = read_timeout 238s 238s # Receive the response from the server 238s try: 238s response = conn.getresponse() 238s except (BaseSSLError, OSError) as e: 238s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 238s raise 238s 238s # Set properties that are used by the pooling layer. 238s response.retries = retries 238s response._connection = response_conn # type: ignore[attr-defined] 238s response._pool = self # type: ignore[attr-defined] 238s 238s log.debug( 238s '%s://%s:%s "%s %s %s" %s %s', 238s self.scheme, 238s self.host, 238s self.port, 238s method, 238s url, 238s > response.version_string, 238s response.status, 238s response.length_remaining, 238s ) 238s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 238s 238s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 238s ----------------------------- Captured stderr call ----------------------------- 238s 127.0.0.1 - - [18/Jan/2025 02:31:46] "GET / HTTP/1.1" 200 9358 238s _____________________________ test_domain_redirect _____________________________ 238s 238s def test_domain_redirect(): 238s """Ensure that redirects across domains are considered unique""" 238s # In this example, seomoz.org redirects to moz.com, and if those 238s # requests are considered identical, then we'll be stuck in a redirect 238s # loop. 
238s url = "http://seomoz.org/" 238s with vcr.use_cassette("tests/fixtures/wild/domain_redirect.yaml") as cass: 238s > requests.get(url, headers={"User-Agent": "vcrpy-test"}) 238s 238s tests/integration/test_wild.py:20: 238s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 238s /usr/lib/python3/dist-packages/requests/api.py:73: in get 238s return request("get", url, params=params, **kwargs) 238s /usr/lib/python3/dist-packages/requests/api.py:59: in request 238s return session.request(method=method, url=url, **kwargs) 238s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 238s resp = self.send(prep, **send_kwargs) 238s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 238s r = adapter.send(request, **kwargs) 238s /usr/lib/python3/dist-packages/requests/adapters.py:667: in send 238s resp = conn.urlopen( 238s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 238s response = self._make_request( 238s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 238s 238s self = 238s conn = 238s method = 'GET', url = '/', body = None 238s headers = {'User-Agent': 'vcrpy-test', 'Accept-Encoding': 'gzip, deflate, br', 'Accept': '*/*', 'Connection': 'keep-alive'} 238s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 238s timeout = Timeout(connect=None, read=None, total=None), chunked = False 238s response_conn = 238s preload_content = False, decode_content = False, enforce_content_length = True 238s 238s def _make_request( 238s self, 238s conn: BaseHTTPConnection, 238s method: str, 238s url: str, 238s body: _TYPE_BODY | None = None, 238s headers: typing.Mapping[str, str] | None = None, 238s retries: Retry | None = None, 238s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 238s chunked: bool = False, 238s response_conn: BaseHTTPConnection | None = None, 238s preload_content: bool = True, 238s decode_content: bool = True, 238s 
238s [... urllib3 _make_request signature, docstring and body repeated here, identical to the dump above for test_https_with_cert_validation_disabled; elided ...]
238s response.retries = retries 238s response._connection = response_conn # type: ignore[attr-defined] 238s response._pool = self # type: ignore[attr-defined] 238s 238s log.debug( 238s '%s://%s:%s "%s %s %s" %s %s', 238s self.scheme, 238s self.host, 238s self.port, 238s method, 238s url, 238s > response.version_string, 238s response.status, 238s response.length_remaining, 238s ) 238s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 238s 238s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 238s _________________________________ test_cookies _________________________________ 238s 238s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-1/test_cookies0') 238s httpbin = 238s 238s def test_cookies(tmpdir, httpbin): 238s testfile = str(tmpdir.join("cookies.yml")) 238s with vcr.use_cassette(testfile): 238s with requests.Session() as s: 238s > s.get(httpbin.url + "/cookies/set?k1=v1&k2=v2") 238s 238s tests/integration/test_wild.py:67: 238s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 238s /usr/lib/python3/dist-packages/requests/sessions.py:602: in get 238s return self.request("GET", url, **kwargs) 238s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 238s resp = self.send(prep, **send_kwargs) 238s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 238s r = adapter.send(request, **kwargs) 238s /usr/lib/python3/dist-packages/requests/adapters.py:667: in send 238s resp = conn.urlopen( 238s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 238s response = self._make_request( 238s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 238s 238s self = 238s conn = 238s method = 'GET', url = '/cookies/set?k1=v1&k2=v2', body = None 238s headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate, br', 'Accept': '*/*', 'Connection': 'keep-alive'} 238s retries = Retry(total=0, connect=None, 
read=False, redirect=None, status=None) 238s timeout = Timeout(connect=None, read=None, total=None), chunked = False 238s response_conn = 238s preload_content = False, decode_content = False, enforce_content_length = True 238s 238s def _make_request( 238s self, 238s conn: BaseHTTPConnection, 238s method: str, 238s url: str, 238s body: _TYPE_BODY | None = None, 238s headers: typing.Mapping[str, str] | None = None, 238s retries: Retry | None = None, 238s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 238s chunked: bool = False, 238s response_conn: BaseHTTPConnection | None = None, 238s preload_content: bool = True, 238s decode_content: bool = True, 238s enforce_content_length: bool = True, 238s ) -> BaseHTTPResponse: 238s """ 238s Perform a request on a given urllib connection object taken from our 238s pool. 238s 238s :param conn: 238s a connection from one of our connection pools 238s 238s :param method: 238s HTTP request method (such as GET, POST, PUT, etc.) 238s 238s :param url: 238s The URL to perform the request on. 238s 238s :param body: 238s Data to send in the request body, either :class:`str`, :class:`bytes`, 238s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 238s 238s :param headers: 238s Dictionary of custom headers to send, such as User-Agent, 238s If-None-Match, etc. If None, pool headers are used. If provided, 238s these headers completely replace any pool-specific headers. 238s 238s :param retries: 238s Configure the number of retries to allow before raising a 238s :class:`~urllib3.exceptions.MaxRetryError` exception. 238s 238s Pass ``None`` to retry until you receive a response. Pass a 238s :class:`~urllib3.util.retry.Retry` object for fine-grained control 238s over different types of retries. 238s Pass an integer number to retry connection errors that many times, 238s but no other types of errors. Pass zero to never retry. 238s 238s If ``False``, then retries are disabled and any exception is raised 238s immediately. 
238s [... urllib3 _make_request docstring and body repeated here, identical to the dump above for test_https_with_cert_validation_disabled; elided ...]
Instead of catching 238s # the exception and assuming all BadStatusLine exceptions are read 238s # timeouts, check for a zero timeout before making the request. 238s if read_timeout == 0: 238s raise ReadTimeoutError( 238s self, url, f"Read timed out. (read timeout={read_timeout})" 238s ) 238s conn.timeout = read_timeout 238s 238s # Receive the response from the server 238s try: 238s response = conn.getresponse() 238s except (BaseSSLError, OSError) as e: 238s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 238s raise 238s 238s # Set properties that are used by the pooling layer. 238s response.retries = retries 238s response._connection = response_conn # type: ignore[attr-defined] 238s response._pool = self # type: ignore[attr-defined] 238s 238s log.debug( 238s '%s://%s:%s "%s %s %s" %s %s', 238s self.scheme, 238s self.host, 238s self.port, 238s method, 238s url, 238s > response.version_string, 238s response.status, 238s response.length_remaining, 238s ) 238s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 238s 238s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 238s ----------------------------- Captured stderr call ----------------------------- 238s 127.0.0.1 - - [18/Jan/2025 02:31:47] "GET /cookies/set?k1=v1&k2=v2 HTTP/1.1" 302 203 238s =============================== warnings summary =============================== 238s tests/integration/test_config.py:10 238s /tmp/autopkgtest.vKDz2s/autopkgtest_tmp/build/tests/integration/test_config.py:10: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 238s @pytest.mark.online 238s 238s tests/integration/test_config.py:24 238s /tmp/autopkgtest.vKDz2s/autopkgtest_tmp/build/tests/integration/test_config.py:24: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? 
You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 238s @pytest.mark.online 238s 238s tests/integration/test_config.py:34 238s /tmp/autopkgtest.vKDz2s/autopkgtest_tmp/build/tests/integration/test_config.py:34: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 238s @pytest.mark.online 238s 238s tests/integration/test_config.py:47 238s /tmp/autopkgtest.vKDz2s/autopkgtest_tmp/build/tests/integration/test_config.py:47: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 238s @pytest.mark.online 238s 238s tests/integration/test_config.py:69 238s /tmp/autopkgtest.vKDz2s/autopkgtest_tmp/build/tests/integration/test_config.py:69: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 238s @pytest.mark.online 238s 238s tests/integration/test_disksaver.py:14 238s /tmp/autopkgtest.vKDz2s/autopkgtest_tmp/build/tests/integration/test_disksaver.py:14: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 238s @pytest.mark.online 238s 238s tests/integration/test_disksaver.py:35 238s /tmp/autopkgtest.vKDz2s/autopkgtest_tmp/build/tests/integration/test_disksaver.py:35: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? 
You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 238s @pytest.mark.online 238s 238s tests/integration/test_httplib2.py:60 238s /tmp/autopkgtest.vKDz2s/autopkgtest_tmp/build/tests/integration/test_httplib2.py:60: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 238s @pytest.mark.online 238s 238s tests/integration/test_register_matcher.py:16 238s /tmp/autopkgtest.vKDz2s/autopkgtest_tmp/build/tests/integration/test_register_matcher.py:16: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 238s @pytest.mark.online 238s 238s tests/integration/test_register_matcher.py:32 238s /tmp/autopkgtest.vKDz2s/autopkgtest_tmp/build/tests/integration/test_register_matcher.py:32: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 238s @pytest.mark.online 238s 238s tests/integration/test_urllib2.py:60 238s /tmp/autopkgtest.vKDz2s/autopkgtest_tmp/build/tests/integration/test_urllib2.py:60: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 238s @mark.online 238s 238s tests/integration/test_urllib3.py:102 238s /tmp/autopkgtest.vKDz2s/autopkgtest_tmp/build/tests/integration/test_urllib3.py:102: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? 
You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 238s @pytest.mark.online 238s 238s tests/integration/test_wild.py:55 238s /tmp/autopkgtest.vKDz2s/autopkgtest_tmp/build/tests/integration/test_wild.py:55: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 238s @pytest.mark.online 238s 238s tests/integration/test_wild.py:74 238s /tmp/autopkgtest.vKDz2s/autopkgtest_tmp/build/tests/integration/test_wild.py:74: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 238s @pytest.mark.online 238s 238s tests/unit/test_stubs.py:20 238s /tmp/autopkgtest.vKDz2s/autopkgtest_tmp/build/tests/unit/test_stubs.py:20: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 238s @mark.online 238s 238s tests/unit/test_unittest.py:131 238s /tmp/autopkgtest.vKDz2s/autopkgtest_tmp/build/tests/unit/test_unittest.py:131: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 238s @pytest.mark.online 238s 238s tests/unit/test_unittest.py:166 238s /tmp/autopkgtest.vKDz2s/autopkgtest_tmp/build/tests/unit/test_unittest.py:166: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? 
You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 238s @pytest.mark.online 238s 238s tests/integration/test_wild.py::test_xmlrpclib 238s /usr/lib/python3.12/multiprocessing/popen_fork.py:66: DeprecationWarning: This process (pid=2904) is multi-threaded, use of fork() may lead to deadlocks in the child. 238s self.pid = os.fork() 238s 238s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 238s =========================== short test summary info ============================ 238s FAILED tests/integration/test_urllib3.py::test_status_code[http] - AttributeE... 238s FAILED tests/integration/test_urllib3.py::test_headers[http] - AttributeError... 238s FAILED tests/integration/test_urllib3.py::test_body[http] - AttributeError: '... 238s FAILED tests/integration/test_urllib3.py::test_auth[http] - AttributeError: '... 238s FAILED tests/integration/test_urllib3.py::test_auth_failed[http] - AttributeE... 238s FAILED tests/integration/test_urllib3.py::test_post[http] - AttributeError: '... 238s FAILED tests/integration/test_urllib3.py::test_gzip[http] - AttributeError: '... 238s FAILED tests/integration/test_urllib3.py::test_status_code[https] - Attribute... 238s FAILED tests/integration/test_urllib3.py::test_headers[https] - AttributeErro... 238s FAILED tests/integration/test_urllib3.py::test_body[https] - AttributeError: ... 238s FAILED tests/integration/test_urllib3.py::test_auth[https] - AttributeError: ... 238s FAILED tests/integration/test_urllib3.py::test_auth_failed[https] - Attribute... 238s FAILED tests/integration/test_urllib3.py::test_post[https] - AttributeError: ... 238s FAILED tests/integration/test_urllib3.py::test_gzip[https] - AttributeError: ... 238s FAILED tests/integration/test_proxy.py::test_use_proxy - AttributeError: 'VCR... 238s FAILED tests/integration/test_urllib3.py::test_cross_scheme - AttributeError:... 
238s FAILED tests/integration/test_urllib3.py::test_https_with_cert_validation_disabled 238s FAILED tests/integration/test_wild.py::test_domain_redirect - AttributeError:... 238s FAILED tests/integration/test_wild.py::test_cookies - AttributeError: 'VCRHTT... 238s ==== 19 failed, 265 passed, 3 skipped, 19 deselected, 18 warnings in 4.06s ===== 238s E: pybuild pybuild:389: test: plugin distutils failed with: exit code=1: cd /tmp/autopkgtest.vKDz2s/autopkgtest_tmp/build; python3.12 -m pytest --ignore tests/integration/test_aiohttp.py --ignore tests/integration/test_tornado.py --ignore tests/integration/test_requests.py -m "not online" -k "not test_basic_json_use and not test_load_cassette_with_custom_persister" 238s pybuild-autopkgtest: error: pybuild --autopkgtest --test-pytest -i python{version} -p "3.13 3.12" returned exit code 13 238s make: *** [/tmp/NYWQVi6k19/run:4: pybuild-autopkgtest] Error 25 238s pybuild-autopkgtest: error: /tmp/NYWQVi6k19/run pybuild-autopkgtest returned exit code 2 238s autopkgtest [02:31:48]: test pybuild-autopkgtest: -----------------------] 239s pybuild-autopkgtest FAIL non-zero exit status 25 239s autopkgtest [02:31:49]: test pybuild-autopkgtest: - - - - - - - - - - results - - - - - - - - - - 239s autopkgtest [02:31:49]: @@@@@@@@@@@@@@@@@@@@ summary 239s pybuild-autopkgtest FAIL non-zero exit status 25 257s nova [W] Using flock in prodstack6-arm64 257s Creating nova instance adt-plucky-arm64-vcr.py-20250118-022750-juju-7f2275-prod-proposed-migration-environment-15-f0f15dcd-3476-4b12-86bb-11581c936de7 from image adt/ubuntu-plucky-arm64-server-20250117.img (UUID 16a981e8-12f4-4912-806e-ebb4c2361146)... 257s nova [W] Timed out waiting for 4f27a5bc-7236-4b6c-a4ae-3d689296457b to get deleted.
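Every failure above is the same AttributeError: urllib3 2.3.0's `_make_request` logs `response.version_string` at connectionpool.py:551, an attribute that vcrpy's `VCRHTTPResponse` stub does not define. A minimal sketch of the missing piece, assuming the stub only carries the integer `version` field that `http.client`-style responses expose (10 for HTTP/1.0, 11 for HTTP/1.1); `FakeHTTPResponse` is a hypothetical stand-in, not vcrpy's actual class:

```python
# Sketch: deriving a urllib3-2.x-style `version_string` from the integer
# `version` attribute used by http.client responses. Hypothetical class,
# illustrating the attribute the traceback shows as missing.

class FakeHTTPResponse:
    def __init__(self, version: int = 11):
        # http.client uses 10 for HTTP/1.0 and 11 for HTTP/1.1
        self.version = version

    @property
    def version_string(self) -> str:
        # urllib3 2.3.0 reads this when logging the request line
        return {9: "HTTP/0.9", 10: "HTTP/1.0", 11: "HTTP/1.1"}.get(
            self.version, f"HTTP/?{self.version}"
        )


print(FakeHTTPResponse(11).version_string)  # HTTP/1.1
```

With such a property on the stub, the `log.debug('%s://%s:%s "%s %s %s" %s %s', ...)` call in `_make_request` would no longer raise, which is why these tests pass against older urllib3 but fail once the proposed 2.3.0 migrates.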