0s autopkgtest [20:21:36]: starting date and time: 2025-01-19 20:21:36+0000 0s autopkgtest [20:21:36]: git checkout: 325255d2 Merge branch 'pin-any-arch' into 'ubuntu/production' 0s autopkgtest [20:21:36]: host juju-7f2275-prod-proposed-migration-environment-2; command line: /home/ubuntu/autopkgtest/runner/autopkgtest --output-dir /tmp/autopkgtest-work.oooghwd3/out --timeout-copy=6000 --setup-commands /home/ubuntu/autopkgtest-cloud/worker-config-production/setup-canonical.sh --apt-pocket=proposed=src:python-urllib3 --apt-upgrade vcr.py --timeout-short=300 --timeout-copy=20000 --timeout-build=20000 --env=ADT_TEST_TRIGGERS=python-urllib3/2.3.0-1 -- ssh -s /home/ubuntu/autopkgtest/ssh-setup/nova -- --flavor builder-cpu2-ram4-disk20 --security-groups autopkgtest-juju-7f2275-prod-proposed-migration-environment-2@bos03-1.secgroup --name adt-plucky-amd64-vcr.py-20250119-202136-juju-7f2275-prod-proposed-migration-environment-2-cda10e07-16b2-4784-9314-63314e3265f8 --image adt/ubuntu-plucky-amd64-server --keyname testbed-juju-7f2275-prod-proposed-migration-environment-2 --net-id=net_prod-proposed-migration-amd64 -e TERM=linux -e ''"'"'http_proxy=http://squid.internal:3128'"'"'' -e ''"'"'https_proxy=http://squid.internal:3128'"'"'' -e ''"'"'no_proxy=127.0.0.1,127.0.1.1,login.ubuntu.com,localhost,localdomain,novalocal,internal,archive.ubuntu.com,ports.ubuntu.com,security.ubuntu.com,ddebs.ubuntu.com,changelogs.ubuntu.com,keyserver.ubuntu.com,launchpadlibrarian.net,launchpadcontent.net,launchpad.net,10.24.0.0/24,keystone.ps5.canonical.com,objectstorage.prodstack5.canonical.com,radosgw.ps5.canonical.com'"'"'' --mirror=http://ftpmaster.internal/ubuntu/ 55s autopkgtest [20:22:31]: testbed dpkg architecture: amd64 55s autopkgtest [20:22:31]: testbed apt version: 2.9.18 55s autopkgtest [20:22:31]: @@@@@@@@@@@@@@@@@@@@ test bed setup 55s autopkgtest [20:22:31]: testbed release detected to be: None 56s autopkgtest [20:22:32]: updating testbed package index (apt update) 57s Get:1 http://ftpmaster.internal/ubuntu plucky-proposed InRelease [73.9 kB] 57s Hit:2 http://ftpmaster.internal/ubuntu plucky InRelease 57s Hit:3 http://ftpmaster.internal/ubuntu plucky-updates InRelease 57s Hit:4 http://ftpmaster.internal/ubuntu plucky-security InRelease 57s Get:5 http://ftpmaster.internal/ubuntu plucky-proposed/universe Sources [783 kB] 57s Get:6 http://ftpmaster.internal/ubuntu plucky-proposed/multiverse Sources [14.6 kB] 57s Get:7 http://ftpmaster.internal/ubuntu plucky-proposed/restricted Sources [9708 B] 57s Get:8 http://ftpmaster.internal/ubuntu plucky-proposed/main Sources [146 kB] 57s Get:9 http://ftpmaster.internal/ubuntu plucky-proposed/main amd64 Packages [276 kB] 57s Get:10 http://ftpmaster.internal/ubuntu plucky-proposed/main i386 Packages [188 kB] 57s Get:11 http://ftpmaster.internal/ubuntu plucky-proposed/restricted amd64 Packages [40.1 kB] 57s Get:12 http://ftpmaster.internal/ubuntu plucky-proposed/restricted i386 Packages [2408 B] 57s Get:13 http://ftpmaster.internal/ubuntu plucky-proposed/universe i386 Packages [381 kB] 57s Get:14 http://ftpmaster.internal/ubuntu plucky-proposed/universe amd64 Packages [915 kB] 57s Get:15 http://ftpmaster.internal/ubuntu plucky-proposed/multiverse amd64 Packages [24.6 kB] 57s Get:16 http://ftpmaster.internal/ubuntu plucky-proposed/multiverse i386 Packages [4116 B] 57s Fetched 2858 kB in 1s (3005 kB/s) 58s Reading package lists... 
59s + lsb_release --codename --short 59s + RELEASE=plucky 59s + cat 59s + [ plucky != trusty ] 59s + DEBIAN_FRONTEND=noninteractive eatmydata apt-get -y --allow-downgrades -o Dpkg::Options::=--force-confnew dist-upgrade 59s Reading package lists... 59s Building dependency tree... 59s Reading state information... 59s Calculating upgrade... 59s The following packages will be upgraded: 59s gir1.2-glib-2.0 libglib2.0-0t64 libglib2.0-bin libglib2.0-data liblz4-1 59s libzstd1 python3.13-gdbm zstd 59s 8 upgraded, 0 newly installed, 0 to remove and 0 not upgraded. 59s Need to get 3233 kB of archives. 59s After this operation, 601 kB of additional disk space will be used. 59s Get:1 http://ftpmaster.internal/ubuntu plucky/main amd64 libzstd1 amd64 1.5.6+dfsg-2 [369 kB] 60s Get:2 http://ftpmaster.internal/ubuntu plucky/main amd64 liblz4-1 amd64 1.9.4-4 [63.9 kB] 60s Get:3 http://ftpmaster.internal/ubuntu plucky/main amd64 libglib2.0-data all 2.82.4-2 [52.3 kB] 60s Get:4 http://ftpmaster.internal/ubuntu plucky/main amd64 libglib2.0-bin amd64 2.82.4-2 [103 kB] 60s Get:5 http://ftpmaster.internal/ubuntu plucky/main amd64 gir1.2-glib-2.0 amd64 2.82.4-2 [182 kB] 60s Get:6 http://ftpmaster.internal/ubuntu plucky/main amd64 libglib2.0-0t64 amd64 2.82.4-2 [1656 kB] 60s Get:7 http://ftpmaster.internal/ubuntu plucky/main amd64 python3.13-gdbm amd64 3.13.1-3 [31.7 kB] 60s Get:8 http://ftpmaster.internal/ubuntu plucky/main amd64 zstd amd64 1.5.6+dfsg-2 [775 kB] 60s Fetched 3233 kB in 1s (4680 kB/s) 60s (Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 89449 files and directories currently installed.) 60s Preparing to unpack .../libzstd1_1.5.6+dfsg-2_amd64.deb ... 60s Unpacking libzstd1:amd64 (1.5.6+dfsg-2) over (1.5.6+dfsg-1) ... 60s Setting up libzstd1:amd64 (1.5.6+dfsg-2) ... 61s (Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 89449 files and directories currently installed.) 61s Preparing to unpack .../liblz4-1_1.9.4-4_amd64.deb ... 61s Unpacking liblz4-1:amd64 (1.9.4-4) over (1.9.4-3) ... 61s Setting up liblz4-1:amd64 (1.9.4-4) ... 61s (Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 
85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 89449 files and directories currently installed.) 61s Preparing to unpack .../0-libglib2.0-data_2.82.4-2_all.deb ... 61s Unpacking libglib2.0-data (2.82.4-2) over (2.82.4-1) ... 61s Preparing to unpack .../1-libglib2.0-bin_2.82.4-2_amd64.deb ... 61s Unpacking libglib2.0-bin (2.82.4-2) over (2.82.4-1) ... 61s Preparing to unpack .../2-gir1.2-glib-2.0_2.82.4-2_amd64.deb ... 61s Unpacking gir1.2-glib-2.0:amd64 (2.82.4-2) over (2.82.4-1) ... 61s Preparing to unpack .../3-libglib2.0-0t64_2.82.4-2_amd64.deb ... 61s Unpacking libglib2.0-0t64:amd64 (2.82.4-2) over (2.82.4-1) ... 61s Preparing to unpack .../4-python3.13-gdbm_3.13.1-3_amd64.deb ... 61s Unpacking python3.13-gdbm (3.13.1-3) over (3.13.1-2) ... 61s Preparing to unpack .../5-zstd_1.5.6+dfsg-2_amd64.deb ... 61s Unpacking zstd (1.5.6+dfsg-2) over (1.5.6+dfsg-1) ... 61s Setting up libglib2.0-0t64:amd64 (2.82.4-2) ... 61s No schema files found: doing nothing. 61s Setting up libglib2.0-data (2.82.4-2) ... 61s Setting up gir1.2-glib-2.0:amd64 (2.82.4-2) ... 61s Setting up zstd (1.5.6+dfsg-2) ... 61s Setting up python3.13-gdbm (3.13.1-3) ... 61s Setting up libglib2.0-bin (2.82.4-2) ... 61s Processing triggers for libc-bin (2.40-4ubuntu1) ... 61s Processing triggers for man-db (2.13.0-1) ... 62s 62s Running kernel seems to be up-to-date. 62s 62s Restarting services... 62s /etc/needrestart/restart.d/systemd-manager 62s systemctl restart packagekit.service polkit.service ssh.service systemd-fsckd.service systemd-journald.service systemd-networkd.service systemd-resolved.service systemd-timesyncd.service systemd-udevd.service udisks2.service 63s 63s Service restarts being deferred: 63s systemctl restart ModemManager.service 63s systemctl restart systemd-logind.service 63s 63s No containers need to be restarted. 63s 63s User sessions running outdated binaries: 63s ubuntu @ session #4: sshd-session[1199] 63s ubuntu @ user manager service: systemd[956] 63s 63s No VM guests are running outdated hypervisor (qemu) binaries on this host. 63s + rm /etc/apt/preferences.d/force-downgrade-to-release.pref 63s + /usr/lib/apt/apt-helper analyze-pattern ?true 63s + DEBIAN_FRONTEND=noninteractive eatmydata apt-get -y purge --autoremove ?obsolete 63s Reading package lists... 64s Building dependency tree... 64s Reading state information... 64s 0 upgraded, 0 newly installed, 0 to remove and 1 not upgraded. 64s + grep -q trusty /etc/lsb-release 64s + [ ! -d /usr/share/doc/unattended-upgrades ] 64s + [ ! -d /usr/share/doc/lxd ] 64s + [ ! -d /usr/share/doc/lxd-client ] 64s + [ ! -d /usr/share/doc/snapd ] 64s + type iptables 64s + cat 64s + chmod 755 /etc/rc.local 64s + . /etc/rc.local 64s + iptables -w -t mangle -A FORWARD -p tcp --tcp-flags SYN,RST SYN -j TCPMSS --clamp-mss-to-pmtu 64s + iptables -A OUTPUT -d 10.255.255.1/32 -p tcp -j DROP 64s + iptables -A OUTPUT -d 10.255.255.2/32 -p tcp -j DROP 64s + uname -m 64s + [ x86_64 = ppc64le ] 64s + [ -d /run/systemd/system ] 64s + systemd-detect-virt --quiet --vm 64s + mkdir -p /etc/systemd/system/systemd-random-seed.service.d/ 64s + cat 64s + grep -q lz4 /etc/initramfs-tools/initramfs.conf 64s + echo COMPRESS=lz4 64s + sync 64s autopkgtest [20:22:40]: upgrading testbed (apt dist-upgrade and autopurge) 64s Reading package lists... 64s Building dependency tree... 64s Reading state information... 
65s Calculating upgrade...Starting pkgProblemResolver with broken count: 0 65s Starting 2 pkgProblemResolver with broken count: 0 65s Done 65s Entering ResolveByKeep 65s 65s The following packages will be upgraded: 65s python3-urllib3 65s 1 upgraded, 0 newly installed, 0 to remove and 0 not upgraded. 65s Need to get 94.0 kB of archives. 65s After this operation, 18.4 kB of additional disk space will be used. 65s Get:1 http://ftpmaster.internal/ubuntu plucky-proposed/main amd64 python3-urllib3 all 2.3.0-1 [94.0 kB] 66s Fetched 94.0 kB in 0s (343 kB/s) 66s (Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 89449 files and directories currently installed.) 66s Preparing to unpack .../python3-urllib3_2.3.0-1_all.deb ... 66s Unpacking python3-urllib3 (2.3.0-1) over (2.0.7-2ubuntu0.1) ... 66s Setting up python3-urllib3 (2.3.0-1) ... 66s 66s Running kernel seems to be up-to-date. 66s 66s Restarting services... 66s 66s Service restarts being deferred: 66s systemctl restart systemd-logind.service 66s 66s No containers need to be restarted. 66s 66s User sessions running outdated binaries: 66s ubuntu @ session #4: sshd-session[1199] 66s ubuntu @ user manager service: systemd[956] 66s 66s No VM guests are running outdated hypervisor (qemu) binaries on this host. 67s Reading package lists... 67s Building dependency tree... 67s Reading state information... 68s Starting pkgProblemResolver with broken count: 0 68s Starting 2 pkgProblemResolver with broken count: 0 68s Done 68s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded. 70s autopkgtest [20:22:46]: testbed running kernel: Linux 6.11.0-8-generic #8-Ubuntu SMP PREEMPT_DYNAMIC Mon Sep 16 13:41:20 UTC 2024 70s autopkgtest [20:22:46]: @@@@@@@@@@@@@@@@@@@@ apt-source vcr.py 72s Get:1 http://ftpmaster.internal/ubuntu plucky/universe vcr.py 6.0.2-2 (dsc) [2977 B] 72s Get:2 http://ftpmaster.internal/ubuntu plucky/universe vcr.py 6.0.2-2 (tar) [339 kB] 72s Get:3 http://ftpmaster.internal/ubuntu plucky/universe vcr.py 6.0.2-2 (diff) [6348 B] 72s gpgv: Signature made Tue Dec 17 14:55:48 2024 UTC 72s gpgv: using RSA key AC0A4FF12611B6FCCF01C111393587D97D86500B 72s gpgv: Can't check signature: No public key 72s dpkg-source: warning: cannot verify inline signature for ./vcr.py_6.0.2-2.dsc: no acceptable signature found 72s autopkgtest [20:22:48]: testing package vcr.py version 6.0.2-2 73s autopkgtest [20:22:49]: build not needed 73s autopkgtest [20:22:49]: test pybuild-autopkgtest: preparing testbed 73s Reading package lists... 74s Building dependency tree... 74s Reading state information... 
74s Starting pkgProblemResolver with broken count: 0 74s Starting 2 pkgProblemResolver with broken count: 0 74s Done 74s The following NEW packages will be installed: 74s autoconf automake autopoint autotools-dev build-essential cpp cpp-14 74s cpp-14-x86-64-linux-gnu cpp-x86-64-linux-gnu debhelper debugedit 74s dh-autoreconf dh-python dh-strip-nondeterminism docutils-common dwz 74s fonts-font-awesome fonts-lato g++ g++-14 g++-14-x86-64-linux-gnu 74s g++-x86-64-linux-gnu gcc gcc-14 gcc-14-x86-64-linux-gnu gcc-x86-64-linux-gnu 74s gettext intltool-debian libarchive-zip-perl libasan8 libcc1-0 74s libdebhelper-perl libfile-stripnondeterminism-perl libgcc-14-dev libgomp1 74s libhwasan0 libisl23 libitm1 libjs-jquery libjs-sphinxdoc libjs-underscore 74s libjson-perl liblsan0 liblua5.4-0 libmpc3 libpython3.13-minimal 74s libpython3.13-stdlib libquadmath0 libstdc++-14-dev libtool libtsan2 74s libubsan1 m4 pandoc pandoc-data po-debconf pybuild-plugin-autopkgtest 74s python-vcr-doc python3-aiohappyeyeballs python3-aiohttp python3-aiosignal 74s python3-alabaster python3-all python3-async-timeout python3-brotli 74s python3-brotlicffi python3-decorator python3-defusedxml python3-docutils 74s python3-flasgger python3-flask python3-frozenlist python3-greenlet 74s python3-httpbin python3-imagesize python3-iniconfig python3-itsdangerous 74s python3-mistune python3-multidict python3-pluggy python3-pytest 74s python3-pytest-httpbin python3-pytest-tornado python3-roman 74s python3-snowballstemmer python3-sphinx python3-sphinx-rtd-theme 74s python3-sphinxcontrib.jquery python3-tornado python3-vcr python3-werkzeug 74s python3-wrapt python3-yarl python3.13 python3.13-minimal sphinx-common 74s sphinx-rtd-theme-common 74s 0 upgraded, 97 newly installed, 0 to remove and 0 not upgraded. 74s Need to get 115 MB of archives. 74s After this operation, 508 MB of additional disk space will be used. 
74s Get:1 http://ftpmaster.internal/ubuntu plucky/main amd64 fonts-lato all 2.015-1 [2781 kB] 75s Get:2 http://ftpmaster.internal/ubuntu plucky/main amd64 libpython3.13-minimal amd64 3.13.1-3 [881 kB] 75s Get:3 http://ftpmaster.internal/ubuntu plucky/main amd64 python3.13-minimal amd64 3.13.1-3 [2358 kB] 76s Get:4 http://ftpmaster.internal/ubuntu plucky/main amd64 m4 amd64 1.4.19-5 [263 kB] 76s Get:5 http://ftpmaster.internal/ubuntu plucky/main amd64 autoconf all 2.72-3 [382 kB] 76s Get:6 http://ftpmaster.internal/ubuntu plucky/main amd64 autotools-dev all 20220109.1 [44.9 kB] 76s Get:7 http://ftpmaster.internal/ubuntu plucky/main amd64 automake all 1:1.16.5-1.3ubuntu1 [558 kB] 76s Get:8 http://ftpmaster.internal/ubuntu plucky/main amd64 autopoint all 0.22.5-3 [616 kB] 76s Get:9 http://ftpmaster.internal/ubuntu plucky/main amd64 libisl23 amd64 0.27-1 [685 kB] 76s Get:10 http://ftpmaster.internal/ubuntu plucky/main amd64 libmpc3 amd64 1.3.1-1build2 [55.3 kB] 76s Get:11 http://ftpmaster.internal/ubuntu plucky/main amd64 cpp-14-x86-64-linux-gnu amd64 14.2.0-13ubuntu1 [11.9 MB] 77s Get:12 http://ftpmaster.internal/ubuntu plucky/main amd64 cpp-14 amd64 14.2.0-13ubuntu1 [1032 B] 77s Get:13 http://ftpmaster.internal/ubuntu plucky/main amd64 cpp-x86-64-linux-gnu amd64 4:14.1.0-2ubuntu1 [5452 B] 77s Get:14 http://ftpmaster.internal/ubuntu plucky/main amd64 cpp amd64 4:14.1.0-2ubuntu1 [22.4 kB] 77s Get:15 http://ftpmaster.internal/ubuntu plucky/main amd64 libcc1-0 amd64 14.2.0-13ubuntu1 [47.6 kB] 77s Get:16 http://ftpmaster.internal/ubuntu plucky/main amd64 libgomp1 amd64 14.2.0-13ubuntu1 [148 kB] 77s Get:17 http://ftpmaster.internal/ubuntu plucky/main amd64 libitm1 amd64 14.2.0-13ubuntu1 [29.1 kB] 77s Get:18 http://ftpmaster.internal/ubuntu plucky/main amd64 libasan8 amd64 14.2.0-13ubuntu1 [2998 kB] 77s Get:19 http://ftpmaster.internal/ubuntu plucky/main amd64 liblsan0 amd64 14.2.0-13ubuntu1 [1317 kB] 77s Get:20 http://ftpmaster.internal/ubuntu plucky/main amd64 libtsan2 amd64 14.2.0-13ubuntu1 [2732 kB] 78s Get:21 http://ftpmaster.internal/ubuntu plucky/main amd64 libubsan1 amd64 14.2.0-13ubuntu1 [1177 kB] 78s Get:22 http://ftpmaster.internal/ubuntu plucky/main amd64 libhwasan0 amd64 14.2.0-13ubuntu1 [1634 kB] 78s Get:23 http://ftpmaster.internal/ubuntu plucky/main amd64 libquadmath0 amd64 14.2.0-13ubuntu1 [153 kB] 78s Get:24 http://ftpmaster.internal/ubuntu plucky/main amd64 libgcc-14-dev amd64 14.2.0-13ubuntu1 [2815 kB] 78s Get:25 http://ftpmaster.internal/ubuntu plucky/main amd64 gcc-14-x86-64-linux-gnu amd64 14.2.0-13ubuntu1 [23.4 MB] 79s Get:26 http://ftpmaster.internal/ubuntu plucky/main amd64 gcc-14 amd64 14.2.0-13ubuntu1 [534 kB] 79s Get:27 http://ftpmaster.internal/ubuntu plucky/main amd64 gcc-x86-64-linux-gnu amd64 4:14.1.0-2ubuntu1 [1214 B] 79s Get:28 http://ftpmaster.internal/ubuntu plucky/main amd64 gcc amd64 4:14.1.0-2ubuntu1 [5000 B] 79s Get:29 http://ftpmaster.internal/ubuntu plucky/main amd64 libstdc++-14-dev amd64 14.2.0-13ubuntu1 [2508 kB] 79s Get:30 http://ftpmaster.internal/ubuntu plucky/main amd64 g++-14-x86-64-linux-gnu amd64 14.2.0-13ubuntu1 [13.3 MB] 79s Get:31 http://ftpmaster.internal/ubuntu plucky/main amd64 g++-14 amd64 14.2.0-13ubuntu1 [21.1 kB] 79s Get:32 http://ftpmaster.internal/ubuntu plucky/main amd64 g++-x86-64-linux-gnu amd64 4:14.1.0-2ubuntu1 [966 B] 79s Get:33 http://ftpmaster.internal/ubuntu plucky/main amd64 g++ amd64 4:14.1.0-2ubuntu1 [1100 B] 79s Get:34 http://ftpmaster.internal/ubuntu plucky/main amd64 build-essential amd64 12.10ubuntu1 [4928 B] 79s 
Get:35 http://ftpmaster.internal/ubuntu plucky/main amd64 libdebhelper-perl all 13.20ubuntu1 [94.2 kB] 79s Get:36 http://ftpmaster.internal/ubuntu plucky/main amd64 libtool all 2.4.7-8 [166 kB] 79s Get:37 http://ftpmaster.internal/ubuntu plucky/main amd64 dh-autoreconf all 20 [16.1 kB] 79s Get:38 http://ftpmaster.internal/ubuntu plucky/main amd64 libarchive-zip-perl all 1.68-1 [90.2 kB] 79s Get:39 http://ftpmaster.internal/ubuntu plucky/main amd64 libfile-stripnondeterminism-perl all 1.14.0-1 [20.1 kB] 79s Get:40 http://ftpmaster.internal/ubuntu plucky/main amd64 dh-strip-nondeterminism all 1.14.0-1 [5058 B] 79s Get:41 http://ftpmaster.internal/ubuntu plucky/main amd64 debugedit amd64 1:5.1-1 [46.9 kB] 79s Get:42 http://ftpmaster.internal/ubuntu plucky/main amd64 dwz amd64 0.15-1build6 [115 kB] 79s Get:43 http://ftpmaster.internal/ubuntu plucky/main amd64 gettext amd64 0.22.5-3 [1025 kB] 79s Get:44 http://ftpmaster.internal/ubuntu plucky/main amd64 intltool-debian all 0.35.0+20060710.6 [23.2 kB] 79s Get:45 http://ftpmaster.internal/ubuntu plucky/main amd64 po-debconf all 1.0.21+nmu1 [233 kB] 79s Get:46 http://ftpmaster.internal/ubuntu plucky/main amd64 debhelper all 13.20ubuntu1 [893 kB] 79s Get:47 http://ftpmaster.internal/ubuntu plucky/universe amd64 dh-python all 6.20241217 [117 kB] 79s Get:48 http://ftpmaster.internal/ubuntu plucky/main amd64 docutils-common all 0.21.2+dfsg-2 [131 kB] 79s Get:49 http://ftpmaster.internal/ubuntu plucky/main amd64 fonts-font-awesome all 5.0.10+really4.7.0~dfsg-4.1 [516 kB] 79s Get:50 http://ftpmaster.internal/ubuntu plucky/main amd64 libjs-jquery all 3.6.1+dfsg+~3.5.14-1 [328 kB] 79s Get:51 http://ftpmaster.internal/ubuntu plucky/main amd64 libjs-underscore all 1.13.4~dfsg+~1.11.4-3 [118 kB] 79s Get:52 http://ftpmaster.internal/ubuntu plucky/main amd64 libjs-sphinxdoc all 8.1.3-3 [30.9 kB] 79s Get:53 http://ftpmaster.internal/ubuntu plucky/main amd64 libjson-perl all 4.10000-1 [81.9 kB] 79s Get:54 http://ftpmaster.internal/ubuntu plucky/main amd64 liblua5.4-0 amd64 5.4.7-1 [196 kB] 79s Get:55 http://ftpmaster.internal/ubuntu plucky/main amd64 libpython3.13-stdlib amd64 3.13.1-3 [2087 kB] 79s Get:56 http://ftpmaster.internal/ubuntu plucky/universe amd64 pandoc-data all 3.1.11.1-3build1 [78.8 kB] 79s Get:57 http://ftpmaster.internal/ubuntu plucky/universe amd64 pandoc amd64 3.1.11.1+ds-2 [27.2 MB] 80s Get:58 http://ftpmaster.internal/ubuntu plucky/universe amd64 pybuild-plugin-autopkgtest all 6.20241217 [1746 B] 80s Get:59 http://ftpmaster.internal/ubuntu plucky/universe amd64 python-vcr-doc all 6.0.2-2 [184 kB] 80s Get:60 http://ftpmaster.internal/ubuntu plucky/universe amd64 python3-aiohappyeyeballs all 2.4.4-2 [10.6 kB] 80s Get:61 http://ftpmaster.internal/ubuntu plucky/universe amd64 python3-multidict amd64 6.1.0-1build1 [38.5 kB] 80s Get:62 http://ftpmaster.internal/ubuntu plucky/universe amd64 python3-yarl amd64 1.13.1-1build1 [127 kB] 80s Get:63 http://ftpmaster.internal/ubuntu plucky/universe amd64 python3-async-timeout all 5.0.1-1 [6830 B] 80s Get:64 http://ftpmaster.internal/ubuntu plucky/universe amd64 python3-frozenlist amd64 1.5.0-1build1 [67.8 kB] 80s Get:65 http://ftpmaster.internal/ubuntu plucky/universe amd64 python3-aiosignal all 1.3.2-1 [5182 B] 80s Get:66 http://ftpmaster.internal/ubuntu plucky/universe amd64 python3-aiohttp amd64 3.10.11-1 [374 kB] 80s Get:67 http://ftpmaster.internal/ubuntu plucky/main amd64 python3.13 amd64 3.13.1-3 [729 kB] 80s Get:68 http://ftpmaster.internal/ubuntu plucky/main amd64 python3-all amd64 3.12.8-1 [890 
B] 80s Get:69 http://ftpmaster.internal/ubuntu plucky/universe amd64 python3-brotli amd64 1.1.0-2build3 [368 kB] 80s Get:70 http://ftpmaster.internal/ubuntu plucky/universe amd64 python3-brotlicffi amd64 1.1.0.0+ds1-1 [18.5 kB] 80s Get:71 http://ftpmaster.internal/ubuntu plucky/main amd64 python3-decorator all 5.1.1-5 [10.1 kB] 80s Get:72 http://ftpmaster.internal/ubuntu plucky/main amd64 python3-defusedxml all 0.7.1-3 [42.2 kB] 80s Get:73 http://ftpmaster.internal/ubuntu plucky/main amd64 python3-roman all 4.2-1 [10.0 kB] 80s Get:74 http://ftpmaster.internal/ubuntu plucky/main amd64 python3-docutils all 0.21.2+dfsg-2 [409 kB] 80s Get:75 http://ftpmaster.internal/ubuntu plucky/main amd64 python3-itsdangerous all 2.2.0-1 [15.2 kB] 80s Get:76 http://ftpmaster.internal/ubuntu plucky/main amd64 python3-werkzeug all 3.1.3-2 [169 kB] 80s Get:77 http://ftpmaster.internal/ubuntu plucky/main amd64 python3-flask all 3.1.0-2ubuntu1 [84.4 kB] 80s Get:78 http://ftpmaster.internal/ubuntu plucky/universe amd64 python3-mistune all 3.0.2-2 [32.9 kB] 80s Get:79 http://ftpmaster.internal/ubuntu plucky/universe amd64 python3-flasgger all 0.9.7.2~dev2+dfsg-3 [1693 kB] 80s Get:80 http://ftpmaster.internal/ubuntu plucky/main amd64 python3-greenlet amd64 3.1.0-1 [183 kB] 80s Get:81 http://ftpmaster.internal/ubuntu plucky/universe amd64 python3-httpbin all 0.10.2+dfsg-2 [89.0 kB] 80s Get:82 http://ftpmaster.internal/ubuntu plucky/main amd64 python3-imagesize all 1.4.1-1 [6844 B] 80s Get:83 http://ftpmaster.internal/ubuntu plucky/universe amd64 python3-iniconfig all 1.1.1-2 [6024 B] 80s Get:84 http://ftpmaster.internal/ubuntu plucky/universe amd64 python3-pluggy all 1.5.0-1 [21.0 kB] 80s Get:85 http://ftpmaster.internal/ubuntu plucky/universe amd64 python3-pytest all 8.3.4-1 [252 kB] 80s Get:86 http://ftpmaster.internal/ubuntu plucky/universe amd64 python3-pytest-httpbin all 2.1.0-1 [13.0 kB] 80s Get:87 http://ftpmaster.internal/ubuntu plucky/main amd64 python3-tornado amd64 6.4.1-3 [299 kB] 80s Get:88 http://ftpmaster.internal/ubuntu plucky/universe amd64 python3-pytest-tornado all 0.8.1-3 [7180 B] 80s Get:89 http://ftpmaster.internal/ubuntu plucky/main amd64 python3-snowballstemmer all 2.2.0-4build1 [59.8 kB] 80s Get:90 http://ftpmaster.internal/ubuntu plucky/main amd64 sphinx-common all 8.1.3-3 [661 kB] 80s Get:91 http://ftpmaster.internal/ubuntu plucky/main amd64 python3-alabaster all 0.7.16-0.1 [18.5 kB] 80s Get:92 http://ftpmaster.internal/ubuntu plucky/main amd64 python3-sphinx all 8.1.3-3 [474 kB] 80s Get:93 http://ftpmaster.internal/ubuntu plucky/main amd64 sphinx-rtd-theme-common all 3.0.2+dfsg-1 [1014 kB] 80s Get:94 http://ftpmaster.internal/ubuntu plucky/universe amd64 python3-sphinxcontrib.jquery all 4.1-5 [6678 B] 80s Get:95 http://ftpmaster.internal/ubuntu plucky/universe amd64 python3-sphinx-rtd-theme all 3.0.2+dfsg-1 [23.5 kB] 80s Get:96 http://ftpmaster.internal/ubuntu plucky/main amd64 python3-wrapt amd64 1.15.0-4 [34.8 kB] 80s Get:97 http://ftpmaster.internal/ubuntu plucky/universe amd64 python3-vcr all 6.0.2-2 [33.0 kB] 81s Fetched 115 MB in 6s (18.5 MB/s) 81s Selecting previously unselected package fonts-lato. 81s (Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 
70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 89455 files and directories currently installed.) 81s Preparing to unpack .../00-fonts-lato_2.015-1_all.deb ... 81s Unpacking fonts-lato (2.015-1) ... 81s Selecting previously unselected package libpython3.13-minimal:amd64. 81s Preparing to unpack .../01-libpython3.13-minimal_3.13.1-3_amd64.deb ... 81s Unpacking libpython3.13-minimal:amd64 (3.13.1-3) ... 81s Selecting previously unselected package python3.13-minimal. 81s Preparing to unpack .../02-python3.13-minimal_3.13.1-3_amd64.deb ... 81s Unpacking python3.13-minimal (3.13.1-3) ... 81s Selecting previously unselected package m4. 81s Preparing to unpack .../03-m4_1.4.19-5_amd64.deb ... 81s Unpacking m4 (1.4.19-5) ... 81s Selecting previously unselected package autoconf. 81s Preparing to unpack .../04-autoconf_2.72-3_all.deb ... 81s Unpacking autoconf (2.72-3) ... 81s Selecting previously unselected package autotools-dev. 81s Preparing to unpack .../05-autotools-dev_20220109.1_all.deb ... 81s Unpacking autotools-dev (20220109.1) ... 81s Selecting previously unselected package automake. 81s Preparing to unpack .../06-automake_1%3a1.16.5-1.3ubuntu1_all.deb ... 81s Unpacking automake (1:1.16.5-1.3ubuntu1) ... 81s Selecting previously unselected package autopoint. 81s Preparing to unpack .../07-autopoint_0.22.5-3_all.deb ... 81s Unpacking autopoint (0.22.5-3) ... 81s Selecting previously unselected package libisl23:amd64. 81s Preparing to unpack .../08-libisl23_0.27-1_amd64.deb ... 81s Unpacking libisl23:amd64 (0.27-1) ... 81s Selecting previously unselected package libmpc3:amd64. 81s Preparing to unpack .../09-libmpc3_1.3.1-1build2_amd64.deb ... 81s Unpacking libmpc3:amd64 (1.3.1-1build2) ... 81s Selecting previously unselected package cpp-14-x86-64-linux-gnu. 81s Preparing to unpack .../10-cpp-14-x86-64-linux-gnu_14.2.0-13ubuntu1_amd64.deb ... 81s Unpacking cpp-14-x86-64-linux-gnu (14.2.0-13ubuntu1) ... 81s Selecting previously unselected package cpp-14. 81s Preparing to unpack .../11-cpp-14_14.2.0-13ubuntu1_amd64.deb ... 81s Unpacking cpp-14 (14.2.0-13ubuntu1) ... 81s Selecting previously unselected package cpp-x86-64-linux-gnu. 81s Preparing to unpack .../12-cpp-x86-64-linux-gnu_4%3a14.1.0-2ubuntu1_amd64.deb ... 81s Unpacking cpp-x86-64-linux-gnu (4:14.1.0-2ubuntu1) ... 81s Selecting previously unselected package cpp. 81s Preparing to unpack .../13-cpp_4%3a14.1.0-2ubuntu1_amd64.deb ... 81s Unpacking cpp (4:14.1.0-2ubuntu1) ... 81s Selecting previously unselected package libcc1-0:amd64. 81s Preparing to unpack .../14-libcc1-0_14.2.0-13ubuntu1_amd64.deb ... 81s Unpacking libcc1-0:amd64 (14.2.0-13ubuntu1) ... 81s Selecting previously unselected package libgomp1:amd64. 81s Preparing to unpack .../15-libgomp1_14.2.0-13ubuntu1_amd64.deb ... 81s Unpacking libgomp1:amd64 (14.2.0-13ubuntu1) ... 81s Selecting previously unselected package libitm1:amd64. 81s Preparing to unpack .../16-libitm1_14.2.0-13ubuntu1_amd64.deb ... 81s Unpacking libitm1:amd64 (14.2.0-13ubuntu1) ... 81s Selecting previously unselected package libasan8:amd64. 81s Preparing to unpack .../17-libasan8_14.2.0-13ubuntu1_amd64.deb ... 81s Unpacking libasan8:amd64 (14.2.0-13ubuntu1) ... 82s Selecting previously unselected package liblsan0:amd64. 82s Preparing to unpack .../18-liblsan0_14.2.0-13ubuntu1_amd64.deb ... 82s Unpacking liblsan0:amd64 (14.2.0-13ubuntu1) ... 
82s Selecting previously unselected package libtsan2:amd64. 82s Preparing to unpack .../19-libtsan2_14.2.0-13ubuntu1_amd64.deb ... 82s Unpacking libtsan2:amd64 (14.2.0-13ubuntu1) ... 82s Selecting previously unselected package libubsan1:amd64. 82s Preparing to unpack .../20-libubsan1_14.2.0-13ubuntu1_amd64.deb ... 82s Unpacking libubsan1:amd64 (14.2.0-13ubuntu1) ... 82s Selecting previously unselected package libhwasan0:amd64. 82s Preparing to unpack .../21-libhwasan0_14.2.0-13ubuntu1_amd64.deb ... 82s Unpacking libhwasan0:amd64 (14.2.0-13ubuntu1) ... 82s Selecting previously unselected package libquadmath0:amd64. 82s Preparing to unpack .../22-libquadmath0_14.2.0-13ubuntu1_amd64.deb ... 82s Unpacking libquadmath0:amd64 (14.2.0-13ubuntu1) ... 82s Selecting previously unselected package libgcc-14-dev:amd64. 82s Preparing to unpack .../23-libgcc-14-dev_14.2.0-13ubuntu1_amd64.deb ... 82s Unpacking libgcc-14-dev:amd64 (14.2.0-13ubuntu1) ... 82s Selecting previously unselected package gcc-14-x86-64-linux-gnu. 82s Preparing to unpack .../24-gcc-14-x86-64-linux-gnu_14.2.0-13ubuntu1_amd64.deb ... 82s Unpacking gcc-14-x86-64-linux-gnu (14.2.0-13ubuntu1) ... 82s Selecting previously unselected package gcc-14. 82s Preparing to unpack .../25-gcc-14_14.2.0-13ubuntu1_amd64.deb ... 82s Unpacking gcc-14 (14.2.0-13ubuntu1) ... 82s Selecting previously unselected package gcc-x86-64-linux-gnu. 82s Preparing to unpack .../26-gcc-x86-64-linux-gnu_4%3a14.1.0-2ubuntu1_amd64.deb ... 82s Unpacking gcc-x86-64-linux-gnu (4:14.1.0-2ubuntu1) ... 82s Selecting previously unselected package gcc. 82s Preparing to unpack .../27-gcc_4%3a14.1.0-2ubuntu1_amd64.deb ... 82s Unpacking gcc (4:14.1.0-2ubuntu1) ... 82s Selecting previously unselected package libstdc++-14-dev:amd64. 82s Preparing to unpack .../28-libstdc++-14-dev_14.2.0-13ubuntu1_amd64.deb ... 82s Unpacking libstdc++-14-dev:amd64 (14.2.0-13ubuntu1) ... 82s Selecting previously unselected package g++-14-x86-64-linux-gnu. 82s Preparing to unpack .../29-g++-14-x86-64-linux-gnu_14.2.0-13ubuntu1_amd64.deb ... 82s Unpacking g++-14-x86-64-linux-gnu (14.2.0-13ubuntu1) ... 83s Selecting previously unselected package g++-14. 83s Preparing to unpack .../30-g++-14_14.2.0-13ubuntu1_amd64.deb ... 83s Unpacking g++-14 (14.2.0-13ubuntu1) ... 83s Selecting previously unselected package g++-x86-64-linux-gnu. 83s Preparing to unpack .../31-g++-x86-64-linux-gnu_4%3a14.1.0-2ubuntu1_amd64.deb ... 83s Unpacking g++-x86-64-linux-gnu (4:14.1.0-2ubuntu1) ... 83s Selecting previously unselected package g++. 83s Preparing to unpack .../32-g++_4%3a14.1.0-2ubuntu1_amd64.deb ... 83s Unpacking g++ (4:14.1.0-2ubuntu1) ... 83s Selecting previously unselected package build-essential. 83s Preparing to unpack .../33-build-essential_12.10ubuntu1_amd64.deb ... 83s Unpacking build-essential (12.10ubuntu1) ... 83s Selecting previously unselected package libdebhelper-perl. 83s Preparing to unpack .../34-libdebhelper-perl_13.20ubuntu1_all.deb ... 83s Unpacking libdebhelper-perl (13.20ubuntu1) ... 83s Selecting previously unselected package libtool. 83s Preparing to unpack .../35-libtool_2.4.7-8_all.deb ... 83s Unpacking libtool (2.4.7-8) ... 83s Selecting previously unselected package dh-autoreconf. 83s Preparing to unpack .../36-dh-autoreconf_20_all.deb ... 83s Unpacking dh-autoreconf (20) ... 83s Selecting previously unselected package libarchive-zip-perl. 83s Preparing to unpack .../37-libarchive-zip-perl_1.68-1_all.deb ... 83s Unpacking libarchive-zip-perl (1.68-1) ... 
83s Selecting previously unselected package libfile-stripnondeterminism-perl. 83s Preparing to unpack .../38-libfile-stripnondeterminism-perl_1.14.0-1_all.deb ... 83s Unpacking libfile-stripnondeterminism-perl (1.14.0-1) ... 83s Selecting previously unselected package dh-strip-nondeterminism. 83s Preparing to unpack .../39-dh-strip-nondeterminism_1.14.0-1_all.deb ... 83s Unpacking dh-strip-nondeterminism (1.14.0-1) ... 83s Selecting previously unselected package debugedit. 83s Preparing to unpack .../40-debugedit_1%3a5.1-1_amd64.deb ... 83s Unpacking debugedit (1:5.1-1) ... 83s Selecting previously unselected package dwz. 83s Preparing to unpack .../41-dwz_0.15-1build6_amd64.deb ... 83s Unpacking dwz (0.15-1build6) ... 83s Selecting previously unselected package gettext. 83s Preparing to unpack .../42-gettext_0.22.5-3_amd64.deb ... 83s Unpacking gettext (0.22.5-3) ... 83s Selecting previously unselected package intltool-debian. 83s Preparing to unpack .../43-intltool-debian_0.35.0+20060710.6_all.deb ... 83s Unpacking intltool-debian (0.35.0+20060710.6) ... 83s Selecting previously unselected package po-debconf. 83s Preparing to unpack .../44-po-debconf_1.0.21+nmu1_all.deb ... 83s Unpacking po-debconf (1.0.21+nmu1) ... 83s Selecting previously unselected package debhelper. 83s Preparing to unpack .../45-debhelper_13.20ubuntu1_all.deb ... 83s Unpacking debhelper (13.20ubuntu1) ... 83s Selecting previously unselected package dh-python. 83s Preparing to unpack .../46-dh-python_6.20241217_all.deb ... 83s Unpacking dh-python (6.20241217) ... 83s Selecting previously unselected package docutils-common. 83s Preparing to unpack .../47-docutils-common_0.21.2+dfsg-2_all.deb ... 83s Unpacking docutils-common (0.21.2+dfsg-2) ... 83s Selecting previously unselected package fonts-font-awesome. 83s Preparing to unpack .../48-fonts-font-awesome_5.0.10+really4.7.0~dfsg-4.1_all.deb ... 83s Unpacking fonts-font-awesome (5.0.10+really4.7.0~dfsg-4.1) ... 83s Selecting previously unselected package libjs-jquery. 83s Preparing to unpack .../49-libjs-jquery_3.6.1+dfsg+~3.5.14-1_all.deb ... 83s Unpacking libjs-jquery (3.6.1+dfsg+~3.5.14-1) ... 83s Selecting previously unselected package libjs-underscore. 83s Preparing to unpack .../50-libjs-underscore_1.13.4~dfsg+~1.11.4-3_all.deb ... 83s Unpacking libjs-underscore (1.13.4~dfsg+~1.11.4-3) ... 83s Selecting previously unselected package libjs-sphinxdoc. 83s Preparing to unpack .../51-libjs-sphinxdoc_8.1.3-3_all.deb ... 83s Unpacking libjs-sphinxdoc (8.1.3-3) ... 83s Selecting previously unselected package libjson-perl. 83s Preparing to unpack .../52-libjson-perl_4.10000-1_all.deb ... 83s Unpacking libjson-perl (4.10000-1) ... 83s Selecting previously unselected package liblua5.4-0:amd64. 83s Preparing to unpack .../53-liblua5.4-0_5.4.7-1_amd64.deb ... 83s Unpacking liblua5.4-0:amd64 (5.4.7-1) ... 83s Selecting previously unselected package libpython3.13-stdlib:amd64. 83s Preparing to unpack .../54-libpython3.13-stdlib_3.13.1-3_amd64.deb ... 83s Unpacking libpython3.13-stdlib:amd64 (3.13.1-3) ... 83s Selecting previously unselected package pandoc-data. 83s Preparing to unpack .../55-pandoc-data_3.1.11.1-3build1_all.deb ... 83s Unpacking pandoc-data (3.1.11.1-3build1) ... 83s Selecting previously unselected package pandoc. 83s Preparing to unpack .../56-pandoc_3.1.11.1+ds-2_amd64.deb ... 83s Unpacking pandoc (3.1.11.1+ds-2) ... 84s Selecting previously unselected package pybuild-plugin-autopkgtest. 
84s Preparing to unpack .../57-pybuild-plugin-autopkgtest_6.20241217_all.deb ... 84s Unpacking pybuild-plugin-autopkgtest (6.20241217) ... 84s Selecting previously unselected package python-vcr-doc. 84s Preparing to unpack .../58-python-vcr-doc_6.0.2-2_all.deb ... 84s Unpacking python-vcr-doc (6.0.2-2) ... 84s Selecting previously unselected package python3-aiohappyeyeballs. 84s Preparing to unpack .../59-python3-aiohappyeyeballs_2.4.4-2_all.deb ... 84s Unpacking python3-aiohappyeyeballs (2.4.4-2) ... 84s Selecting previously unselected package python3-multidict. 84s Preparing to unpack .../60-python3-multidict_6.1.0-1build1_amd64.deb ... 84s Unpacking python3-multidict (6.1.0-1build1) ... 84s Selecting previously unselected package python3-yarl. 84s Preparing to unpack .../61-python3-yarl_1.13.1-1build1_amd64.deb ... 84s Unpacking python3-yarl (1.13.1-1build1) ... 84s Selecting previously unselected package python3-async-timeout. 84s Preparing to unpack .../62-python3-async-timeout_5.0.1-1_all.deb ... 84s Unpacking python3-async-timeout (5.0.1-1) ... 84s Selecting previously unselected package python3-frozenlist. 84s Preparing to unpack .../63-python3-frozenlist_1.5.0-1build1_amd64.deb ... 84s Unpacking python3-frozenlist (1.5.0-1build1) ... 84s Selecting previously unselected package python3-aiosignal. 84s Preparing to unpack .../64-python3-aiosignal_1.3.2-1_all.deb ... 84s Unpacking python3-aiosignal (1.3.2-1) ... 84s Selecting previously unselected package python3-aiohttp. 84s Preparing to unpack .../65-python3-aiohttp_3.10.11-1_amd64.deb ... 84s Unpacking python3-aiohttp (3.10.11-1) ... 84s Selecting previously unselected package python3.13. 84s Preparing to unpack .../66-python3.13_3.13.1-3_amd64.deb ... 84s Unpacking python3.13 (3.13.1-3) ... 84s Selecting previously unselected package python3-all. 84s Preparing to unpack .../67-python3-all_3.12.8-1_amd64.deb ... 84s Unpacking python3-all (3.12.8-1) ... 84s Selecting previously unselected package python3-brotli. 84s Preparing to unpack .../68-python3-brotli_1.1.0-2build3_amd64.deb ... 84s Unpacking python3-brotli (1.1.0-2build3) ... 84s Selecting previously unselected package python3-brotlicffi. 84s Preparing to unpack .../69-python3-brotlicffi_1.1.0.0+ds1-1_amd64.deb ... 84s Unpacking python3-brotlicffi (1.1.0.0+ds1-1) ... 84s Selecting previously unselected package python3-decorator. 84s Preparing to unpack .../70-python3-decorator_5.1.1-5_all.deb ... 84s Unpacking python3-decorator (5.1.1-5) ... 84s Selecting previously unselected package python3-defusedxml. 84s Preparing to unpack .../71-python3-defusedxml_0.7.1-3_all.deb ... 84s Unpacking python3-defusedxml (0.7.1-3) ... 84s Selecting previously unselected package python3-roman. 84s Preparing to unpack .../72-python3-roman_4.2-1_all.deb ... 84s Unpacking python3-roman (4.2-1) ... 84s Selecting previously unselected package python3-docutils. 84s Preparing to unpack .../73-python3-docutils_0.21.2+dfsg-2_all.deb ... 84s Unpacking python3-docutils (0.21.2+dfsg-2) ... 84s Selecting previously unselected package python3-itsdangerous. 84s Preparing to unpack .../74-python3-itsdangerous_2.2.0-1_all.deb ... 84s Unpacking python3-itsdangerous (2.2.0-1) ... 84s Selecting previously unselected package python3-werkzeug. 84s Preparing to unpack .../75-python3-werkzeug_3.1.3-2_all.deb ... 84s Unpacking python3-werkzeug (3.1.3-2) ... 85s Selecting previously unselected package python3-flask. 85s Preparing to unpack .../76-python3-flask_3.1.0-2ubuntu1_all.deb ... 
85s Unpacking python3-flask (3.1.0-2ubuntu1) ... 85s Selecting previously unselected package python3-mistune. 85s Preparing to unpack .../77-python3-mistune_3.0.2-2_all.deb ... 85s Unpacking python3-mistune (3.0.2-2) ... 85s Selecting previously unselected package python3-flasgger. 85s Preparing to unpack .../78-python3-flasgger_0.9.7.2~dev2+dfsg-3_all.deb ... 85s Unpacking python3-flasgger (0.9.7.2~dev2+dfsg-3) ... 85s Selecting previously unselected package python3-greenlet. 85s Preparing to unpack .../79-python3-greenlet_3.1.0-1_amd64.deb ... 85s Unpacking python3-greenlet (3.1.0-1) ... 85s Selecting previously unselected package python3-httpbin. 85s Preparing to unpack .../80-python3-httpbin_0.10.2+dfsg-2_all.deb ... 85s Unpacking python3-httpbin (0.10.2+dfsg-2) ... 85s Selecting previously unselected package python3-imagesize. 85s Preparing to unpack .../81-python3-imagesize_1.4.1-1_all.deb ... 85s Unpacking python3-imagesize (1.4.1-1) ... 85s Selecting previously unselected package python3-iniconfig. 85s Preparing to unpack .../82-python3-iniconfig_1.1.1-2_all.deb ... 85s Unpacking python3-iniconfig (1.1.1-2) ... 85s Selecting previously unselected package python3-pluggy. 85s Preparing to unpack .../83-python3-pluggy_1.5.0-1_all.deb ... 85s Unpacking python3-pluggy (1.5.0-1) ... 85s Selecting previously unselected package python3-pytest. 85s Preparing to unpack .../84-python3-pytest_8.3.4-1_all.deb ... 85s Unpacking python3-pytest (8.3.4-1) ... 85s Selecting previously unselected package python3-pytest-httpbin. 85s Preparing to unpack .../85-python3-pytest-httpbin_2.1.0-1_all.deb ... 85s Unpacking python3-pytest-httpbin (2.1.0-1) ... 85s Selecting previously unselected package python3-tornado. 85s Preparing to unpack .../86-python3-tornado_6.4.1-3_amd64.deb ... 85s Unpacking python3-tornado (6.4.1-3) ... 85s Selecting previously unselected package python3-pytest-tornado. 85s Preparing to unpack .../87-python3-pytest-tornado_0.8.1-3_all.deb ... 85s Unpacking python3-pytest-tornado (0.8.1-3) ... 85s Selecting previously unselected package python3-snowballstemmer. 85s Preparing to unpack .../88-python3-snowballstemmer_2.2.0-4build1_all.deb ... 85s Unpacking python3-snowballstemmer (2.2.0-4build1) ... 85s Selecting previously unselected package sphinx-common. 85s Preparing to unpack .../89-sphinx-common_8.1.3-3_all.deb ... 85s Unpacking sphinx-common (8.1.3-3) ... 85s Selecting previously unselected package python3-alabaster. 85s Preparing to unpack .../90-python3-alabaster_0.7.16-0.1_all.deb ... 85s Unpacking python3-alabaster (0.7.16-0.1) ... 85s Selecting previously unselected package python3-sphinx. 85s Preparing to unpack .../91-python3-sphinx_8.1.3-3_all.deb ... 85s Unpacking python3-sphinx (8.1.3-3) ... 85s Selecting previously unselected package sphinx-rtd-theme-common. 85s Preparing to unpack .../92-sphinx-rtd-theme-common_3.0.2+dfsg-1_all.deb ... 85s Unpacking sphinx-rtd-theme-common (3.0.2+dfsg-1) ... 85s Selecting previously unselected package python3-sphinxcontrib.jquery. 85s Preparing to unpack .../93-python3-sphinxcontrib.jquery_4.1-5_all.deb ... 85s Unpacking python3-sphinxcontrib.jquery (4.1-5) ... 85s Selecting previously unselected package python3-sphinx-rtd-theme. 85s Preparing to unpack .../94-python3-sphinx-rtd-theme_3.0.2+dfsg-1_all.deb ... 85s Unpacking python3-sphinx-rtd-theme (3.0.2+dfsg-1) ... 85s Selecting previously unselected package python3-wrapt. 85s Preparing to unpack .../95-python3-wrapt_1.15.0-4_amd64.deb ... 85s Unpacking python3-wrapt (1.15.0-4) ... 
85s Selecting previously unselected package python3-vcr. 85s Preparing to unpack .../96-python3-vcr_6.0.2-2_all.deb ... 85s Unpacking python3-vcr (6.0.2-2) ... 85s Setting up dh-python (6.20241217) ... 85s Setting up python3-iniconfig (1.1.1-2) ... 86s Setting up python3-tornado (6.4.1-3) ... 86s Setting up python3-brotlicffi (1.1.0.0+ds1-1) ... 86s Setting up fonts-lato (2.015-1) ... 86s Setting up python3-defusedxml (0.7.1-3) ... 86s Setting up libarchive-zip-perl (1.68-1) ... 86s Setting up python3-alabaster (0.7.16-0.1) ... 86s Setting up libdebhelper-perl (13.20ubuntu1) ... 86s Setting up m4 (1.4.19-5) ... 87s Setting up python3-itsdangerous (2.2.0-1) ... 87s Setting up libgomp1:amd64 (14.2.0-13ubuntu1) ... 87s Setting up python3-multidict (6.1.0-1build1) ... 87s Setting up python3-frozenlist (1.5.0-1build1) ... 87s Setting up python3-aiosignal (1.3.2-1) ... 87s Setting up python3-async-timeout (5.0.1-1) ... 87s Setting up libpython3.13-minimal:amd64 (3.13.1-3) ... 87s Setting up python3-roman (4.2-1) ... 87s Setting up python3-decorator (5.1.1-5) ... 87s Setting up autotools-dev (20220109.1) ... 87s Setting up python3-snowballstemmer (2.2.0-4build1) ... 88s Setting up python3-werkzeug (3.1.3-2) ... 88s Setting up python3-brotli (1.1.0-2build3) ... 88s Setting up python3-greenlet (3.1.0-1) ... 88s Setting up libquadmath0:amd64 (14.2.0-13ubuntu1) ... 88s Setting up libmpc3:amd64 (1.3.1-1build2) ... 89s Setting up python3-wrapt (1.15.0-4) ... 89s Setting up autopoint (0.22.5-3) ... 89s Setting up python3-aiohappyeyeballs (2.4.4-2) ... 89s Setting up autoconf (2.72-3) ... 89s Setting up python3-pluggy (1.5.0-1) ... 89s Setting up libubsan1:amd64 (14.2.0-13ubuntu1) ... 89s Setting up dwz (0.15-1build6) ... 89s Setting up libhwasan0:amd64 (14.2.0-13ubuntu1) ... 89s Setting up libasan8:amd64 (14.2.0-13ubuntu1) ... 89s Setting up docutils-common (0.21.2+dfsg-2) ... 89s Setting up libjson-perl (4.10000-1) ... 89s Setting up debugedit (1:5.1-1) ... 89s Setting up liblua5.4-0:amd64 (5.4.7-1) ... 89s Setting up python3.13-minimal (3.13.1-3) ... 90s Setting up pandoc-data (3.1.11.1-3build1) ... 90s Setting up libtsan2:amd64 (14.2.0-13ubuntu1) ... 90s Setting up libjs-jquery (3.6.1+dfsg+~3.5.14-1) ... 90s Setting up libisl23:amd64 (0.27-1) ... 90s Setting up python3-yarl (1.13.1-1build1) ... 90s Setting up python3-mistune (3.0.2-2) ... 90s Setting up libpython3.13-stdlib:amd64 (3.13.1-3) ... 90s Setting up fonts-font-awesome (5.0.10+really4.7.0~dfsg-4.1) ... 90s Setting up sphinx-rtd-theme-common (3.0.2+dfsg-1) ... 90s Setting up libcc1-0:amd64 (14.2.0-13ubuntu1) ... 90s Setting up liblsan0:amd64 (14.2.0-13ubuntu1) ... 90s Setting up libitm1:amd64 (14.2.0-13ubuntu1) ... 90s Setting up libjs-underscore (1.13.4~dfsg+~1.11.4-3) ... 90s Setting up python3-imagesize (1.4.1-1) ... 90s Setting up automake (1:1.16.5-1.3ubuntu1) ... 90s update-alternatives: using /usr/bin/automake-1.16 to provide /usr/bin/automake (automake) in auto mode 90s Setting up libfile-stripnondeterminism-perl (1.14.0-1) ... 90s Setting up gettext (0.22.5-3) ... 90s Setting up python3.13 (3.13.1-3) ... 91s Setting up python3-pytest (8.3.4-1) ... 91s Setting up python3-flask (3.1.0-2ubuntu1) ... 91s Setting up python3-aiohttp (3.10.11-1) ... 92s Setting up python3-all (3.12.8-1) ... 92s Setting up intltool-debian (0.35.0+20060710.6) ... 92s Setting up pandoc (3.1.11.1+ds-2) ... 92s Setting up python3-pytest-tornado (0.8.1-3) ... 92s Setting up cpp-14-x86-64-linux-gnu (14.2.0-13ubuntu1) ... 92s Setting up python3-vcr (6.0.2-2) ... 
92s Setting up libjs-sphinxdoc (8.1.3-3) ...
92s Setting up cpp-14 (14.2.0-13ubuntu1) ...
92s Setting up dh-strip-nondeterminism (1.14.0-1) ...
92s Setting up libgcc-14-dev:amd64 (14.2.0-13ubuntu1) ...
92s Setting up libstdc++-14-dev:amd64 (14.2.0-13ubuntu1) ...
92s Setting up cpp-x86-64-linux-gnu (4:14.1.0-2ubuntu1) ...
92s Setting up python-vcr-doc (6.0.2-2) ...
92s Setting up python3-flasgger (0.9.7.2~dev2+dfsg-3) ...
92s Setting up po-debconf (1.0.21+nmu1) ...
92s Setting up sphinx-common (8.1.3-3) ...
92s Setting up python3-httpbin (0.10.2+dfsg-2) ...
93s Setting up cpp (4:14.1.0-2ubuntu1) ...
93s Setting up python3-pytest-httpbin (2.1.0-1) ...
93s Setting up gcc-14-x86-64-linux-gnu (14.2.0-13ubuntu1) ...
93s Setting up gcc-x86-64-linux-gnu (4:14.1.0-2ubuntu1) ...
93s Setting up gcc-14 (14.2.0-13ubuntu1) ...
93s Setting up g++-14-x86-64-linux-gnu (14.2.0-13ubuntu1) ...
93s Setting up g++-x86-64-linux-gnu (4:14.1.0-2ubuntu1) ...
93s Setting up g++-14 (14.2.0-13ubuntu1) ...
93s Setting up libtool (2.4.7-8) ...
93s Setting up gcc (4:14.1.0-2ubuntu1) ...
93s Setting up dh-autoreconf (20) ...
93s Setting up g++ (4:14.1.0-2ubuntu1) ...
93s update-alternatives: using /usr/bin/g++ to provide /usr/bin/c++ (c++) in auto mode
93s Setting up build-essential (12.10ubuntu1) ...
93s Setting up debhelper (13.20ubuntu1) ...
93s Setting up pybuild-plugin-autopkgtest (6.20241217) ...
93s Processing triggers for install-info (7.1.1-1) ...
93s Processing triggers for libc-bin (2.40-4ubuntu1) ...
93s Processing triggers for systemd (257-2ubuntu1) ...
93s Processing triggers for man-db (2.13.0-1) ...
94s Processing triggers for sgml-base (1.31) ...
94s Setting up python3-docutils (0.21.2+dfsg-2) ...
95s Setting up python3-sphinx (8.1.3-3) ...
96s Setting up python3-sphinxcontrib.jquery (4.1-5) ...
96s Setting up python3-sphinx-rtd-theme (3.0.2+dfsg-1) ...
96s
96s Running kernel seems to be up-to-date.
96s
96s Restarting services...
96s
96s Service restarts being deferred:
96s systemctl restart systemd-logind.service
96s
96s No containers need to be restarted.
96s
96s User sessions running outdated binaries:
96s ubuntu @ session #4: sshd-session[1199]
96s ubuntu @ user manager service: systemd[956]
96s
96s No VM guests are running outdated hypervisor (qemu) binaries on this host.
98s autopkgtest [20:23:14]: test pybuild-autopkgtest: pybuild-autopkgtest
98s autopkgtest [20:23:14]: test pybuild-autopkgtest: [-----------------------
99s pybuild-autopkgtest
99s I: pybuild base:311: cd /tmp/autopkgtest.eenLbd/autopkgtest_tmp/build; python3.13 -m pytest --ignore tests/integration/test_aiohttp.py --ignore tests/integration/test_tornado.py --ignore tests/integration/test_requests.py -m "not online" -k "not test_basic_json_use and not test_load_cassette_with_custom_persister"
101s ============================= test session starts ==============================
101s platform linux -- Python 3.13.1, pytest-8.3.4, pluggy-1.5.0
101s rootdir: /tmp/autopkgtest.eenLbd/autopkgtest_tmp/build
101s plugins: httpbin-2.1.0, tornado-0.8.1, typeguard-4.4.1
101s collected 305 items / 19 deselected / 1 skipped / 286 selected
101s
101s tests/integration/test_basic.py .... [ 1%]
101s tests/integration/test_boto3.py ss [ 2%]
101s tests/integration/test_config.py . [ 2%]
101s tests/integration/test_filter.py .......... [ 5%]
101s tests/integration/test_httplib2.py ........ [ 8%]
101s tests/integration/test_urllib2.py ........ [ 11%]
101s tests/integration/test_urllib3.py FFFFFFF [ 13%]
101s tests/integration/test_httplib2.py ........ [ 16%]
101s tests/integration/test_urllib2.py ........ [ 19%]
102s tests/integration/test_urllib3.py FFFFFFF [ 22%]
102s tests/integration/test_httplib2.py . [ 22%]
102s tests/integration/test_ignore.py .... [ 23%]
102s tests/integration/test_matchers.py .............. [ 28%]
102s tests/integration/test_multiple.py . [ 29%]
102s tests/integration/test_proxy.py F [ 29%]
102s tests/integration/test_record_mode.py ........ [ 32%]
102s tests/integration/test_register_persister.py .. [ 32%]
102s tests/integration/test_register_serializer.py . [ 33%]
102s tests/integration/test_request.py .. [ 33%]
102s tests/integration/test_stubs.py .... [ 35%]
102s tests/integration/test_urllib2.py . [ 35%]
102s tests/integration/test_urllib3.py FF. [ 36%]
102s tests/integration/test_wild.py F.F. [ 38%]
102s tests/unit/test_cassettes.py ............................... [ 48%]
102s tests/unit/test_errors.py .... [ 50%]
102s tests/unit/test_filters.py ........................ [ 58%]
102s tests/unit/test_json_serializer.py . [ 59%]
102s tests/unit/test_matchers.py ............................ [ 68%]
102s tests/unit/test_migration.py ... [ 69%]
102s tests/unit/test_persist.py .... [ 71%]
102s tests/unit/test_request.py ................. [ 77%]
102s tests/unit/test_response.py .... [ 78%]
102s tests/unit/test_serialize.py ............... [ 83%]
102s tests/unit/test_stubs.py .F. [ 84%]
102s tests/unit/test_unittest.py ....... [ 87%]
102s tests/unit/test_util.py ........... [ 91%]
102s tests/unit/test_vcr.py ........................ [ 99%]
103s tests/unit/test_vcr_import.py . [100%]
103s
103s =================================== FAILURES ===================================
103s ____________________________ test_status_code[http] ____________________________
103s
103s httpbin_both =
103s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-0/test_status_code_http_0')
103s verify_pool_mgr =
103s
103s def test_status_code(httpbin_both, tmpdir, verify_pool_mgr):
103s """Ensure that we can read the status code"""
103s url = httpbin_both.url
103s with vcr.use_cassette(str(tmpdir.join("atts.yaml"))):
103s > status_code = verify_pool_mgr.request("GET", url).status
103s
103s tests/integration/test_urllib3.py:34:
103s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
103s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request
103s return self.request_encode_url(
103s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url
103s return self.urlopen(method, url, **extra_kw)
103s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen
103s response = conn.urlopen(method, u.request_uri, **kw)
103s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen
103s response = self._make_request(
103s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
103s
103s self =
103s conn =
103s method = 'GET', url = '/', body = None, headers = {}
103s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None)
103s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None)
103s chunked = False, response_conn = None, preload_content = True
103s decode_content = True, enforce_content_length = True
103s
103s def _make_request(
103s self,
103s conn: BaseHTTPConnection,
103s method: str,
103s url: str,
103s body: _TYPE_BODY | None = None,
103s headers: typing.Mapping[str, str] | None = None,
103s retries: Retry | None = None,
103s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
103s chunked: bool = False,
103s response_conn: BaseHTTPConnection | None = None,
103s preload_content: bool = True,
103s decode_content: bool = True,
103s enforce_content_length: bool = True,
103s ) -> BaseHTTPResponse:
103s """
103s Perform a request on a given urllib connection object taken from our
103s pool.
103s
103s :param conn:
103s a connection from one of our connection pools
103s
103s :param method:
103s HTTP request method (such as GET, POST, PUT, etc.)
103s
103s :param url:
103s The URL to perform the request on.
103s
103s :param body:
103s Data to send in the request body, either :class:`str`, :class:`bytes`,
103s an iterable of :class:`str`/:class:`bytes`, or a file-like object.
103s
103s :param headers:
103s Dictionary of custom headers to send, such as User-Agent,
103s If-None-Match, etc. If None, pool headers are used. If provided,
103s these headers completely replace any pool-specific headers.
103s
103s :param retries:
103s Configure the number of retries to allow before raising a
103s :class:`~urllib3.exceptions.MaxRetryError` exception.
103s
103s Pass ``None`` to retry until you receive a response. Pass a
103s :class:`~urllib3.util.retry.Retry` object for fine-grained control
103s over different types of retries.
103s Pass an integer number to retry connection errors that many times,
103s but no other types of errors. Pass zero to never retry.
103s
103s If ``False``, then retries are disabled and any exception is raised
103s immediately. Also, instead of raising a MaxRetryError on redirects,
103s the redirect response will be returned.
103s
103s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
103s
103s :param timeout:
103s If specified, overrides the default timeout for this one
103s request. It may be a float (in seconds) or an instance of
103s :class:`urllib3.util.Timeout`.
103s
103s :param chunked:
103s If True, urllib3 will send the body using chunked transfer
103s encoding. Otherwise, urllib3 will send the body using the standard
103s content-length form. Defaults to False.
103s
103s :param response_conn:
103s Set this to ``None`` if you will handle releasing the connection or
103s set the connection to have the response release it.
103s
103s :param preload_content:
103s If True, the response's body will be preloaded during construction.
103s
103s :param decode_content:
103s If True, will attempt to decode the body based on the
103s 'content-encoding' header.
103s
103s :param enforce_content_length:
103s Enforce content length checking. Body returned by server must match
103s value of Content-Length header, if present. Otherwise, raise error.
103s """
103s self.num_requests += 1
103s
103s timeout_obj = self._get_timeout(timeout)
103s timeout_obj.start_connect()
103s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout)
103s
103s try:
103s # Trigger any extra validation we need to do.
103s try:
103s self._validate_conn(conn)
103s except (SocketTimeout, BaseSSLError) as e:
103s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout)
103s raise
103s
103s # _validate_conn() starts the connection to an HTTPS proxy
103s # so we need to wrap errors with 'ProxyError' here too.
103s except (
103s OSError,
103s NewConnectionError,
103s TimeoutError,
103s BaseSSLError,
103s CertificateError,
103s SSLError,
103s ) as e:
103s new_e: Exception = e
103s if isinstance(e, (BaseSSLError, CertificateError)):
103s new_e = SSLError(e)
103s # If the connection didn't successfully connect to it's proxy
103s # then there
103s if isinstance(
103s new_e, (OSError, NewConnectionError, TimeoutError, SSLError)
103s ) and (conn and conn.proxy and not conn.has_connected_to_proxy):
103s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme)
103s raise new_e
103s
103s # conn.request() calls http.client.*.request, not the method in
103s # urllib3.request. It also calls makefile (recv) on the socket.
103s try:
103s conn.request(
103s method,
103s url,
103s body=body,
103s headers=headers,
103s chunked=chunked,
103s preload_content=preload_content,
103s decode_content=decode_content,
103s enforce_content_length=enforce_content_length,
103s )
103s
103s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is
103s # legitimately able to close the connection after sending a valid response.
103s # With this behaviour, the received response is still readable.
103s except BrokenPipeError:
103s pass
103s except OSError as e:
103s # MacOS/Linux
103s # EPROTOTYPE and ECONNRESET are needed on macOS
103s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/
103s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE.
103s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET:
103s raise
103s
103s # Reset the timeout for the recv() on the socket
103s read_timeout = timeout_obj.read_timeout
103s
103s if not conn.is_closed:
103s # In Python 3 socket.py will catch EAGAIN and return None when you
103s # try and read into the file pointer created by http.client, which
103s # instead raises a BadStatusLine exception. Instead of catching
103s # the exception and assuming all BadStatusLine exceptions are read
103s # timeouts, check for a zero timeout before making the request.
103s if read_timeout == 0:
103s raise ReadTimeoutError(
103s self, url, f"Read timed out. (read timeout={read_timeout})"
103s )
103s conn.timeout = read_timeout
103s
103s # Receive the response from the server
103s try:
103s response = conn.getresponse()
103s except (BaseSSLError, OSError) as e:
103s self._raise_timeout(err=e, url=url, timeout_value=read_timeout)
103s raise
103s
103s # Set properties that are used by the pooling layer.
103s response.retries = retries
103s response._connection = response_conn # type: ignore[attr-defined]
103s response._pool = self # type: ignore[attr-defined]
103s
103s log.debug(
103s '%s://%s:%s "%s %s %s" %s %s',
103s self.scheme,
103s self.host,
103s self.port,
103s method,
103s url,
103s > response.version_string,
103s response.status,
103s response.length_remaining,
103s )
103s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
103s
103s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError
103s ----------------------------- Captured stderr call -----------------------------
103s 127.0.0.1 - - [19/Jan/2025 20:23:17] "GET / HTTP/1.1" 200 9358
103s ______________________________ test_headers[http] ______________________________
103s
103s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-0/test_headers_http_0')
103s httpbin_both =
103s verify_pool_mgr =
103s
103s def test_headers(tmpdir, httpbin_both, verify_pool_mgr):
103s """Ensure that we can read the headers back"""
103s url = httpbin_both.url
103s with vcr.use_cassette(str(tmpdir.join("headers.yaml"))):
103s > headers = verify_pool_mgr.request("GET", url).headers
103s
103s tests/integration/test_urllib3.py:44:
103s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
103s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request
103s return self.request_encode_url(
103s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url
103s return self.urlopen(method, url, **extra_kw)
103s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen
103s response = conn.urlopen(method, u.request_uri, **kw)
103s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen
103s response = self._make_request(
103s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
103s
103s self =
103s conn =
103s method = 'GET', url = '/', body = None, headers = {}
103s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None)
103s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None)
103s chunked = False, response_conn = None, preload_content = True
103s decode_content = True, enforce_content_length = True
103s
103s def _make_request(
103s self,
103s conn: BaseHTTPConnection,
103s method: str,
103s url: str,
103s body: _TYPE_BODY | None = None,
103s headers: typing.Mapping[str, str] | None = None,
103s retries: Retry | None = None,
103s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
103s chunked: bool = False,
103s response_conn: BaseHTTPConnection | None = None,
103s preload_content: bool = True,
103s decode_content: bool = True,
103s enforce_content_length: bool = True,
103s ) -> BaseHTTPResponse:
103s """
103s Perform a request on a given urllib connection object taken from our
103s pool.
103s
103s :param conn:
103s a connection from one of our connection pools
103s
103s :param method:
103s HTTP request method (such as GET, POST, PUT, etc.)
103s
103s :param url:
103s The URL to perform the request on.
103s
103s :param body:
103s Data to send in the request body, either :class:`str`, :class:`bytes`,
103s an iterable of :class:`str`/:class:`bytes`, or a file-like object.
103s
103s :param headers:
103s Dictionary of custom headers to send, such as User-Agent,
103s If-None-Match, etc. If None, pool headers are used. If provided,
103s these headers completely replace any pool-specific headers.
103s 103s :param retries: 103s Configure the number of retries to allow before raising a 103s :class:`~urllib3.exceptions.MaxRetryError` exception. 103s 103s Pass ``None`` to retry until you receive a response. Pass a 103s :class:`~urllib3.util.retry.Retry` object for fine-grained control 103s over different types of retries. 103s Pass an integer number to retry connection errors that many times, 103s but no other types of errors. Pass zero to never retry. 103s 103s If ``False``, then retries are disabled and any exception is raised 103s immediately. Also, instead of raising a MaxRetryError on redirects, 103s the redirect response will be returned. 103s 103s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 103s 103s :param timeout: 103s If specified, overrides the default timeout for this one 103s request. It may be a float (in seconds) or an instance of 103s :class:`urllib3.util.Timeout`. 103s 103s :param chunked: 103s If True, urllib3 will send the body using chunked transfer 103s encoding. Otherwise, urllib3 will send the body using the standard 103s content-length form. Defaults to False. 103s 103s :param response_conn: 103s Set this to ``None`` if you will handle releasing the connection or 103s set the connection to have the response release it. 103s 103s :param preload_content: 103s If True, the response's body will be preloaded during construction. 103s 103s :param decode_content: 103s If True, will attempt to decode the body based on the 103s 'content-encoding' header. 103s 103s :param enforce_content_length: 103s Enforce content length checking. Body returned by server must match 103s value of Content-Length header, if present. Otherwise, raise error. 103s """ 103s self.num_requests += 1 103s 103s timeout_obj = self._get_timeout(timeout) 103s timeout_obj.start_connect() 103s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 103s 103s try: 103s # Trigger any extra validation we need to do. 103s try: 103s self._validate_conn(conn) 103s except (SocketTimeout, BaseSSLError) as e: 103s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 103s raise 103s 103s # _validate_conn() starts the connection to an HTTPS proxy 103s # so we need to wrap errors with 'ProxyError' here too. 103s except ( 103s OSError, 103s NewConnectionError, 103s TimeoutError, 103s BaseSSLError, 103s CertificateError, 103s SSLError, 103s ) as e: 103s new_e: Exception = e 103s if isinstance(e, (BaseSSLError, CertificateError)): 103s new_e = SSLError(e) 103s # If the connection didn't successfully connect to it's proxy 103s # then there 103s if isinstance( 103s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 103s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 103s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 103s raise new_e 103s 103s # conn.request() calls http.client.*.request, not the method in 103s # urllib3.request. It also calls makefile (recv) on the socket. 103s try: 103s conn.request( 103s method, 103s url, 103s body=body, 103s headers=headers, 103s chunked=chunked, 103s preload_content=preload_content, 103s decode_content=decode_content, 103s enforce_content_length=enforce_content_length, 103s ) 103s 103s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 103s # legitimately able to close the connection after sending a valid response. 103s # With this behaviour, the received response is still readable. 
103s except BrokenPipeError: 103s pass 103s except OSError as e: 103s # MacOS/Linux 103s # EPROTOTYPE and ECONNRESET are needed on macOS 103s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 103s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 103s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 103s raise 103s 103s # Reset the timeout for the recv() on the socket 103s read_timeout = timeout_obj.read_timeout 103s 103s if not conn.is_closed: 103s # In Python 3 socket.py will catch EAGAIN and return None when you 103s # try and read into the file pointer created by http.client, which 103s # instead raises a BadStatusLine exception. Instead of catching 103s # the exception and assuming all BadStatusLine exceptions are read 103s # timeouts, check for a zero timeout before making the request. 103s if read_timeout == 0: 103s raise ReadTimeoutError( 103s self, url, f"Read timed out. (read timeout={read_timeout})" 103s ) 103s conn.timeout = read_timeout 103s 103s # Receive the response from the server 103s try: 103s response = conn.getresponse() 103s except (BaseSSLError, OSError) as e: 103s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 103s raise 103s 103s # Set properties that are used by the pooling layer. 103s response.retries = retries 103s response._connection = response_conn # type: ignore[attr-defined] 103s response._pool = self # type: ignore[attr-defined] 103s 103s log.debug( 103s '%s://%s:%s "%s %s %s" %s %s', 103s self.scheme, 103s self.host, 103s self.port, 103s method, 103s url, 103s > response.version_string, 103s response.status, 103s response.length_remaining, 103s ) 103s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 103s 103s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 103s ----------------------------- Captured stderr call ----------------------------- 103s 127.0.0.1 - - [19/Jan/2025 20:23:17] "GET / HTTP/1.1" 200 9358 103s _______________________________ test_body[http] ________________________________ 103s 103s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-0/test_body_http_0') 103s httpbin_both = 103s verify_pool_mgr = 103s 103s def test_body(tmpdir, httpbin_both, verify_pool_mgr): 103s """Ensure the responses are all identical enough""" 103s url = httpbin_both.url + "/bytes/1024" 103s with vcr.use_cassette(str(tmpdir.join("body.yaml"))): 103s > content = verify_pool_mgr.request("GET", url).data 103s 103s tests/integration/test_urllib3.py:55: 103s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 103s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 103s return self.request_encode_url( 103s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 103s return self.urlopen(method, url, **extra_kw) 103s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 103s response = conn.urlopen(method, u.request_uri, **kw) 103s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 103s response = self._make_request( 103s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 103s 103s self = 103s conn = 103s method = 'GET', url = '/bytes/1024', body = None, headers = {} 103s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 103s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 103s chunked = False, 
response_conn = None, preload_content = True 103s decode_content = True, enforce_content_length = True 103s 103s def _make_request( 103s self, 103s conn: BaseHTTPConnection, 103s method: str, 103s url: str, 103s body: _TYPE_BODY | None = None, 103s headers: typing.Mapping[str, str] | None = None, 103s retries: Retry | None = None, 103s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 103s chunked: bool = False, 103s response_conn: BaseHTTPConnection | None = None, 103s preload_content: bool = True, 103s decode_content: bool = True, 103s enforce_content_length: bool = True, 103s ) -> BaseHTTPResponse: 103s """ 103s Perform a request on a given urllib connection object taken from our 103s pool. 103s 103s :param conn: 103s a connection from one of our connection pools 103s 103s :param method: 103s HTTP request method (such as GET, POST, PUT, etc.) 103s 103s :param url: 103s The URL to perform the request on. 103s 103s :param body: 103s Data to send in the request body, either :class:`str`, :class:`bytes`, 103s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 103s 103s :param headers: 103s Dictionary of custom headers to send, such as User-Agent, 103s If-None-Match, etc. If None, pool headers are used. If provided, 103s these headers completely replace any pool-specific headers. 103s 103s :param retries: 103s Configure the number of retries to allow before raising a 103s :class:`~urllib3.exceptions.MaxRetryError` exception. 103s 103s Pass ``None`` to retry until you receive a response. Pass a 103s :class:`~urllib3.util.retry.Retry` object for fine-grained control 103s over different types of retries. 103s Pass an integer number to retry connection errors that many times, 103s but no other types of errors. Pass zero to never retry. 103s 103s If ``False``, then retries are disabled and any exception is raised 103s immediately. Also, instead of raising a MaxRetryError on redirects, 103s the redirect response will be returned. 103s 103s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 103s 103s :param timeout: 103s If specified, overrides the default timeout for this one 103s request. It may be a float (in seconds) or an instance of 103s :class:`urllib3.util.Timeout`. 103s 103s :param chunked: 103s If True, urllib3 will send the body using chunked transfer 103s encoding. Otherwise, urllib3 will send the body using the standard 103s content-length form. Defaults to False. 103s 103s :param response_conn: 103s Set this to ``None`` if you will handle releasing the connection or 103s set the connection to have the response release it. 103s 103s :param preload_content: 103s If True, the response's body will be preloaded during construction. 103s 103s :param decode_content: 103s If True, will attempt to decode the body based on the 103s 'content-encoding' header. 103s 103s :param enforce_content_length: 103s Enforce content length checking. Body returned by server must match 103s value of Content-Length header, if present. Otherwise, raise error. 103s """ 103s self.num_requests += 1 103s 103s timeout_obj = self._get_timeout(timeout) 103s timeout_obj.start_connect() 103s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 103s 103s try: 103s # Trigger any extra validation we need to do. 
103s try: 103s self._validate_conn(conn) 103s except (SocketTimeout, BaseSSLError) as e: 103s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 103s raise 103s 103s # _validate_conn() starts the connection to an HTTPS proxy 103s # so we need to wrap errors with 'ProxyError' here too. 103s except ( 103s OSError, 103s NewConnectionError, 103s TimeoutError, 103s BaseSSLError, 103s CertificateError, 103s SSLError, 103s ) as e: 103s new_e: Exception = e 103s if isinstance(e, (BaseSSLError, CertificateError)): 103s new_e = SSLError(e) 103s # If the connection didn't successfully connect to it's proxy 103s # then there 103s if isinstance( 103s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 103s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 103s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 103s raise new_e 103s 103s # conn.request() calls http.client.*.request, not the method in 103s # urllib3.request. It also calls makefile (recv) on the socket. 103s try: 103s conn.request( 103s method, 103s url, 103s body=body, 103s headers=headers, 103s chunked=chunked, 103s preload_content=preload_content, 103s decode_content=decode_content, 103s enforce_content_length=enforce_content_length, 103s ) 103s 103s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 103s # legitimately able to close the connection after sending a valid response. 103s # With this behaviour, the received response is still readable. 103s except BrokenPipeError: 103s pass 103s except OSError as e: 103s # MacOS/Linux 103s # EPROTOTYPE and ECONNRESET are needed on macOS 103s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 103s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 103s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 103s raise 103s 103s # Reset the timeout for the recv() on the socket 103s read_timeout = timeout_obj.read_timeout 103s 103s if not conn.is_closed: 103s # In Python 3 socket.py will catch EAGAIN and return None when you 103s # try and read into the file pointer created by http.client, which 103s # instead raises a BadStatusLine exception. Instead of catching 103s # the exception and assuming all BadStatusLine exceptions are read 103s # timeouts, check for a zero timeout before making the request. 103s if read_timeout == 0: 103s raise ReadTimeoutError( 103s self, url, f"Read timed out. (read timeout={read_timeout})" 103s ) 103s conn.timeout = read_timeout 103s 103s # Receive the response from the server 103s try: 103s response = conn.getresponse() 103s except (BaseSSLError, OSError) as e: 103s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 103s raise 103s 103s # Set properties that are used by the pooling layer. 
103s response.retries = retries 103s response._connection = response_conn # type: ignore[attr-defined] 103s response._pool = self # type: ignore[attr-defined] 103s 103s log.debug( 103s '%s://%s:%s "%s %s %s" %s %s', 103s self.scheme, 103s self.host, 103s self.port, 103s method, 103s url, 103s > response.version_string, 103s response.status, 103s response.length_remaining, 103s ) 103s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 103s 103s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 103s ----------------------------- Captured stderr call ----------------------------- 103s 127.0.0.1 - - [19/Jan/2025 20:23:17] "GET /bytes/1024 HTTP/1.1" 200 1024 103s _______________________________ test_auth[http] ________________________________ 103s 103s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-0/test_auth_http_0') 103s httpbin_both = 103s verify_pool_mgr = 103s 103s def test_auth(tmpdir, httpbin_both, verify_pool_mgr): 103s """Ensure that we can handle basic auth""" 103s auth = ("user", "passwd") 103s headers = urllib3.util.make_headers(basic_auth="{}:{}".format(*auth)) 103s url = httpbin_both.url + "/basic-auth/user/passwd" 103s with vcr.use_cassette(str(tmpdir.join("auth.yaml"))): 103s > one = verify_pool_mgr.request("GET", url, headers=headers) 103s 103s tests/integration/test_urllib3.py:67: 103s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 103s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 103s return self.request_encode_url( 103s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 103s return self.urlopen(method, url, **extra_kw) 103s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 103s response = conn.urlopen(method, u.request_uri, **kw) 103s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 103s response = self._make_request( 103s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 103s 103s self = 103s conn = 103s method = 'GET', url = '/basic-auth/user/passwd', body = None 103s headers = {'authorization': 'Basic dXNlcjpwYXNzd2Q='} 103s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 103s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 103s chunked = False, response_conn = None, preload_content = True 103s decode_content = True, enforce_content_length = True 103s 103s def _make_request( 103s self, 103s conn: BaseHTTPConnection, 103s method: str, 103s url: str, 103s body: _TYPE_BODY | None = None, 103s headers: typing.Mapping[str, str] | None = None, 103s retries: Retry | None = None, 103s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 103s chunked: bool = False, 103s response_conn: BaseHTTPConnection | None = None, 103s preload_content: bool = True, 103s decode_content: bool = True, 103s enforce_content_length: bool = True, 103s ) -> BaseHTTPResponse: 103s """ 103s Perform a request on a given urllib connection object taken from our 103s pool. 103s 103s :param conn: 103s a connection from one of our connection pools 103s 103s :param method: 103s HTTP request method (such as GET, POST, PUT, etc.) 103s 103s :param url: 103s The URL to perform the request on. 103s 103s :param body: 103s Data to send in the request body, either :class:`str`, :class:`bytes`, 103s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 
103s 103s :param headers: 103s Dictionary of custom headers to send, such as User-Agent, 103s If-None-Match, etc. If None, pool headers are used. If provided, 103s these headers completely replace any pool-specific headers. 103s 103s :param retries: 103s Configure the number of retries to allow before raising a 103s :class:`~urllib3.exceptions.MaxRetryError` exception. 103s 103s Pass ``None`` to retry until you receive a response. Pass a 103s :class:`~urllib3.util.retry.Retry` object for fine-grained control 103s over different types of retries. 103s Pass an integer number to retry connection errors that many times, 103s but no other types of errors. Pass zero to never retry. 103s 103s If ``False``, then retries are disabled and any exception is raised 103s immediately. Also, instead of raising a MaxRetryError on redirects, 103s the redirect response will be returned. 103s 103s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 103s 103s :param timeout: 103s If specified, overrides the default timeout for this one 103s request. It may be a float (in seconds) or an instance of 103s :class:`urllib3.util.Timeout`. 103s 103s :param chunked: 103s If True, urllib3 will send the body using chunked transfer 103s encoding. Otherwise, urllib3 will send the body using the standard 103s content-length form. Defaults to False. 103s 103s :param response_conn: 103s Set this to ``None`` if you will handle releasing the connection or 103s set the connection to have the response release it. 103s 103s :param preload_content: 103s If True, the response's body will be preloaded during construction. 103s 103s :param decode_content: 103s If True, will attempt to decode the body based on the 103s 'content-encoding' header. 103s 103s :param enforce_content_length: 103s Enforce content length checking. Body returned by server must match 103s value of Content-Length header, if present. Otherwise, raise error. 103s """ 103s self.num_requests += 1 103s 103s timeout_obj = self._get_timeout(timeout) 103s timeout_obj.start_connect() 103s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 103s 103s try: 103s # Trigger any extra validation we need to do. 103s try: 103s self._validate_conn(conn) 103s except (SocketTimeout, BaseSSLError) as e: 103s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 103s raise 103s 103s # _validate_conn() starts the connection to an HTTPS proxy 103s # so we need to wrap errors with 'ProxyError' here too. 103s except ( 103s OSError, 103s NewConnectionError, 103s TimeoutError, 103s BaseSSLError, 103s CertificateError, 103s SSLError, 103s ) as e: 103s new_e: Exception = e 103s if isinstance(e, (BaseSSLError, CertificateError)): 103s new_e = SSLError(e) 103s # If the connection didn't successfully connect to it's proxy 103s # then there 103s if isinstance( 103s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 103s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 103s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 103s raise new_e 103s 103s # conn.request() calls http.client.*.request, not the method in 103s # urllib3.request. It also calls makefile (recv) on the socket. 
103s try: 103s conn.request( 103s method, 103s url, 103s body=body, 103s headers=headers, 103s chunked=chunked, 103s preload_content=preload_content, 103s decode_content=decode_content, 103s enforce_content_length=enforce_content_length, 103s ) 103s 103s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 103s # legitimately able to close the connection after sending a valid response. 103s # With this behaviour, the received response is still readable. 103s except BrokenPipeError: 103s pass 103s except OSError as e: 103s # MacOS/Linux 103s # EPROTOTYPE and ECONNRESET are needed on macOS 103s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 103s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 103s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 103s raise 103s 103s # Reset the timeout for the recv() on the socket 103s read_timeout = timeout_obj.read_timeout 103s 103s if not conn.is_closed: 103s # In Python 3 socket.py will catch EAGAIN and return None when you 103s # try and read into the file pointer created by http.client, which 103s # instead raises a BadStatusLine exception. Instead of catching 103s # the exception and assuming all BadStatusLine exceptions are read 103s # timeouts, check for a zero timeout before making the request. 103s if read_timeout == 0: 103s raise ReadTimeoutError( 103s self, url, f"Read timed out. (read timeout={read_timeout})" 103s ) 103s conn.timeout = read_timeout 103s 103s # Receive the response from the server 103s try: 103s response = conn.getresponse() 103s except (BaseSSLError, OSError) as e: 103s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 103s raise 103s 103s # Set properties that are used by the pooling layer. 
103s response.retries = retries 103s response._connection = response_conn # type: ignore[attr-defined] 103s response._pool = self # type: ignore[attr-defined] 103s 103s log.debug( 103s '%s://%s:%s "%s %s %s" %s %s', 103s self.scheme, 103s self.host, 103s self.port, 103s method, 103s url, 103s > response.version_string, 103s response.status, 103s response.length_remaining, 103s ) 103s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 103s 103s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 103s ----------------------------- Captured stderr call ----------------------------- 103s 127.0.0.1 - - [19/Jan/2025 20:23:17] "GET /basic-auth/user/passwd HTTP/1.1" 200 46 103s ____________________________ test_auth_failed[http] ____________________________ 103s 103s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-0/test_auth_failed_http_0') 103s httpbin_both = 103s verify_pool_mgr = 103s 103s def test_auth_failed(tmpdir, httpbin_both, verify_pool_mgr): 103s """Ensure that we can save failed auth statuses""" 103s auth = ("user", "wrongwrongwrong") 103s headers = urllib3.util.make_headers(basic_auth="{}:{}".format(*auth)) 103s url = httpbin_both.url + "/basic-auth/user/passwd" 103s with vcr.use_cassette(str(tmpdir.join("auth-failed.yaml"))) as cass: 103s # Ensure that this is empty to begin with 103s assert_cassette_empty(cass) 103s > one = verify_pool_mgr.request("GET", url, headers=headers) 103s 103s tests/integration/test_urllib3.py:83: 103s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 103s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 103s return self.request_encode_url( 103s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 103s return self.urlopen(method, url, **extra_kw) 103s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 103s response = conn.urlopen(method, u.request_uri, **kw) 103s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 103s response = self._make_request( 103s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 103s 103s self = 103s conn = 103s method = 'GET', url = '/basic-auth/user/passwd', body = None 103s headers = {'authorization': 'Basic dXNlcjp3cm9uZ3dyb25nd3Jvbmc='} 103s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 103s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 103s chunked = False, response_conn = None, preload_content = True 103s decode_content = True, enforce_content_length = True 103s 103s def _make_request( 103s self, 103s conn: BaseHTTPConnection, 103s method: str, 103s url: str, 103s body: _TYPE_BODY | None = None, 103s headers: typing.Mapping[str, str] | None = None, 103s retries: Retry | None = None, 103s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 103s chunked: bool = False, 103s response_conn: BaseHTTPConnection | None = None, 103s preload_content: bool = True, 103s decode_content: bool = True, 103s enforce_content_length: bool = True, 103s ) -> BaseHTTPResponse: 103s """ 103s Perform a request on a given urllib connection object taken from our 103s pool. 103s 103s :param conn: 103s a connection from one of our connection pools 103s 103s :param method: 103s HTTP request method (such as GET, POST, PUT, etc.) 103s 103s :param url: 103s The URL to perform the request on. 
103s 103s :param body: 103s Data to send in the request body, either :class:`str`, :class:`bytes`, 103s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 103s 103s :param headers: 103s Dictionary of custom headers to send, such as User-Agent, 103s If-None-Match, etc. If None, pool headers are used. If provided, 103s these headers completely replace any pool-specific headers. 103s 103s :param retries: 103s Configure the number of retries to allow before raising a 103s :class:`~urllib3.exceptions.MaxRetryError` exception. 103s 103s Pass ``None`` to retry until you receive a response. Pass a 103s :class:`~urllib3.util.retry.Retry` object for fine-grained control 103s over different types of retries. 103s Pass an integer number to retry connection errors that many times, 103s but no other types of errors. Pass zero to never retry. 103s 103s If ``False``, then retries are disabled and any exception is raised 103s immediately. Also, instead of raising a MaxRetryError on redirects, 103s the redirect response will be returned. 103s 103s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 103s 103s :param timeout: 103s If specified, overrides the default timeout for this one 103s request. It may be a float (in seconds) or an instance of 103s :class:`urllib3.util.Timeout`. 103s 103s :param chunked: 103s If True, urllib3 will send the body using chunked transfer 103s encoding. Otherwise, urllib3 will send the body using the standard 103s content-length form. Defaults to False. 103s 103s :param response_conn: 103s Set this to ``None`` if you will handle releasing the connection or 103s set the connection to have the response release it. 103s 103s :param preload_content: 103s If True, the response's body will be preloaded during construction. 103s 103s :param decode_content: 103s If True, will attempt to decode the body based on the 103s 'content-encoding' header. 103s 103s :param enforce_content_length: 103s Enforce content length checking. Body returned by server must match 103s value of Content-Length header, if present. Otherwise, raise error. 103s """ 103s self.num_requests += 1 103s 103s timeout_obj = self._get_timeout(timeout) 103s timeout_obj.start_connect() 103s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 103s 103s try: 103s # Trigger any extra validation we need to do. 103s try: 103s self._validate_conn(conn) 103s except (SocketTimeout, BaseSSLError) as e: 103s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 103s raise 103s 103s # _validate_conn() starts the connection to an HTTPS proxy 103s # so we need to wrap errors with 'ProxyError' here too. 103s except ( 103s OSError, 103s NewConnectionError, 103s TimeoutError, 103s BaseSSLError, 103s CertificateError, 103s SSLError, 103s ) as e: 103s new_e: Exception = e 103s if isinstance(e, (BaseSSLError, CertificateError)): 103s new_e = SSLError(e) 103s # If the connection didn't successfully connect to it's proxy 103s # then there 103s if isinstance( 103s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 103s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 103s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 103s raise new_e 103s 103s # conn.request() calls http.client.*.request, not the method in 103s # urllib3.request. It also calls makefile (recv) on the socket. 
103s try: 103s conn.request( 103s method, 103s url, 103s body=body, 103s headers=headers, 103s chunked=chunked, 103s preload_content=preload_content, 103s decode_content=decode_content, 103s enforce_content_length=enforce_content_length, 103s ) 103s 103s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 103s # legitimately able to close the connection after sending a valid response. 103s # With this behaviour, the received response is still readable. 103s except BrokenPipeError: 103s pass 103s except OSError as e: 103s # MacOS/Linux 103s # EPROTOTYPE and ECONNRESET are needed on macOS 103s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 103s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 103s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 103s raise 103s 103s # Reset the timeout for the recv() on the socket 103s read_timeout = timeout_obj.read_timeout 103s 103s if not conn.is_closed: 103s # In Python 3 socket.py will catch EAGAIN and return None when you 103s # try and read into the file pointer created by http.client, which 103s # instead raises a BadStatusLine exception. Instead of catching 103s # the exception and assuming all BadStatusLine exceptions are read 103s # timeouts, check for a zero timeout before making the request. 103s if read_timeout == 0: 103s raise ReadTimeoutError( 103s self, url, f"Read timed out. (read timeout={read_timeout})" 103s ) 103s conn.timeout = read_timeout 103s 103s # Receive the response from the server 103s try: 103s response = conn.getresponse() 103s except (BaseSSLError, OSError) as e: 103s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 103s raise 103s 103s # Set properties that are used by the pooling layer. 
103s response.retries = retries 103s response._connection = response_conn # type: ignore[attr-defined] 103s response._pool = self # type: ignore[attr-defined] 103s 103s log.debug( 103s '%s://%s:%s "%s %s %s" %s %s', 103s self.scheme, 103s self.host, 103s self.port, 103s method, 103s url, 103s > response.version_string, 103s response.status, 103s response.length_remaining, 103s ) 103s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 103s 103s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 103s ----------------------------- Captured stderr call ----------------------------- 103s 127.0.0.1 - - [19/Jan/2025 20:23:17] "GET /basic-auth/user/passwd HTTP/1.1" 401 0 103s _______________________________ test_post[http] ________________________________ 103s 103s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-0/test_post_http_0') 103s httpbin_both = 103s verify_pool_mgr = 103s 103s def test_post(tmpdir, httpbin_both, verify_pool_mgr): 103s """Ensure that we can post and cache the results""" 103s data = {"key1": "value1", "key2": "value2"} 103s url = httpbin_both.url + "/post" 103s with vcr.use_cassette(str(tmpdir.join("verify_pool_mgr.yaml"))): 103s > req1 = verify_pool_mgr.request("POST", url, data).data 103s 103s tests/integration/test_urllib3.py:94: 103s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 103s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:143: in request 103s return self.request_encode_body( 103s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:278: in request_encode_body 103s return self.urlopen(method, url, **extra_kw) 103s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 103s response = conn.urlopen(method, u.request_uri, **kw) 103s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 103s response = self._make_request( 103s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 103s 103s self = 103s conn = 103s method = 'POST', url = '/post', body = {'key1': 'value1', 'key2': 'value2'} 103s headers = HTTPHeaderDict({}) 103s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 103s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 103s chunked = False, response_conn = None, preload_content = True 103s decode_content = True, enforce_content_length = True 103s 103s def _make_request( 103s self, 103s conn: BaseHTTPConnection, 103s method: str, 103s url: str, 103s body: _TYPE_BODY | None = None, 103s headers: typing.Mapping[str, str] | None = None, 103s retries: Retry | None = None, 103s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 103s chunked: bool = False, 103s response_conn: BaseHTTPConnection | None = None, 103s preload_content: bool = True, 103s decode_content: bool = True, 103s enforce_content_length: bool = True, 103s ) -> BaseHTTPResponse: 103s """ 103s Perform a request on a given urllib connection object taken from our 103s pool. 103s 103s :param conn: 103s a connection from one of our connection pools 103s 103s :param method: 103s HTTP request method (such as GET, POST, PUT, etc.) 103s 103s :param url: 103s The URL to perform the request on. 103s 103s :param body: 103s Data to send in the request body, either :class:`str`, :class:`bytes`, 103s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 103s 103s :param headers: 103s Dictionary of custom headers to send, such as User-Agent, 103s If-None-Match, etc. 
If None, pool headers are used. If provided, 103s these headers completely replace any pool-specific headers. 103s 103s :param retries: 103s Configure the number of retries to allow before raising a 103s :class:`~urllib3.exceptions.MaxRetryError` exception. 103s 103s Pass ``None`` to retry until you receive a response. Pass a 103s :class:`~urllib3.util.retry.Retry` object for fine-grained control 103s over different types of retries. 103s Pass an integer number to retry connection errors that many times, 103s but no other types of errors. Pass zero to never retry. 103s 103s If ``False``, then retries are disabled and any exception is raised 103s immediately. Also, instead of raising a MaxRetryError on redirects, 103s the redirect response will be returned. 103s 103s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 103s 103s :param timeout: 103s If specified, overrides the default timeout for this one 103s request. It may be a float (in seconds) or an instance of 103s :class:`urllib3.util.Timeout`. 103s 103s :param chunked: 103s If True, urllib3 will send the body using chunked transfer 103s encoding. Otherwise, urllib3 will send the body using the standard 103s content-length form. Defaults to False. 103s 103s :param response_conn: 103s Set this to ``None`` if you will handle releasing the connection or 103s set the connection to have the response release it. 103s 103s :param preload_content: 103s If True, the response's body will be preloaded during construction. 103s 103s :param decode_content: 103s If True, will attempt to decode the body based on the 103s 'content-encoding' header. 103s 103s :param enforce_content_length: 103s Enforce content length checking. Body returned by server must match 103s value of Content-Length header, if present. Otherwise, raise error. 103s """ 103s self.num_requests += 1 103s 103s timeout_obj = self._get_timeout(timeout) 103s timeout_obj.start_connect() 103s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 103s 103s try: 103s # Trigger any extra validation we need to do. 103s try: 103s self._validate_conn(conn) 103s except (SocketTimeout, BaseSSLError) as e: 103s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 103s raise 103s 103s # _validate_conn() starts the connection to an HTTPS proxy 103s # so we need to wrap errors with 'ProxyError' here too. 103s except ( 103s OSError, 103s NewConnectionError, 103s TimeoutError, 103s BaseSSLError, 103s CertificateError, 103s SSLError, 103s ) as e: 103s new_e: Exception = e 103s if isinstance(e, (BaseSSLError, CertificateError)): 103s new_e = SSLError(e) 103s # If the connection didn't successfully connect to it's proxy 103s # then there 103s if isinstance( 103s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 103s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 103s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 103s raise new_e 103s 103s # conn.request() calls http.client.*.request, not the method in 103s # urllib3.request. It also calls makefile (recv) on the socket. 103s try: 103s conn.request( 103s method, 103s url, 103s body=body, 103s headers=headers, 103s chunked=chunked, 103s preload_content=preload_content, 103s decode_content=decode_content, 103s enforce_content_length=enforce_content_length, 103s ) 103s 103s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 103s # legitimately able to close the connection after sending a valid response. 
103s # With this behaviour, the received response is still readable. 103s except BrokenPipeError: 103s pass 103s except OSError as e: 103s # MacOS/Linux 103s # EPROTOTYPE and ECONNRESET are needed on macOS 103s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 103s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 103s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 103s raise 103s 103s # Reset the timeout for the recv() on the socket 103s read_timeout = timeout_obj.read_timeout 103s 103s if not conn.is_closed: 103s # In Python 3 socket.py will catch EAGAIN and return None when you 103s # try and read into the file pointer created by http.client, which 103s # instead raises a BadStatusLine exception. Instead of catching 103s # the exception and assuming all BadStatusLine exceptions are read 103s # timeouts, check for a zero timeout before making the request. 103s if read_timeout == 0: 103s raise ReadTimeoutError( 103s self, url, f"Read timed out. (read timeout={read_timeout})" 103s ) 103s conn.timeout = read_timeout 103s 103s # Receive the response from the server 103s try: 103s response = conn.getresponse() 103s except (BaseSSLError, OSError) as e: 103s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 103s raise 103s 103s # Set properties that are used by the pooling layer. 103s response.retries = retries 103s response._connection = response_conn # type: ignore[attr-defined] 103s response._pool = self # type: ignore[attr-defined] 103s 103s log.debug( 103s '%s://%s:%s "%s %s %s" %s %s', 103s self.scheme, 103s self.host, 103s self.port, 103s method, 103s url, 103s > response.version_string, 103s response.status, 103s response.length_remaining, 103s ) 103s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 103s 103s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 103s ----------------------------- Captured stderr call ----------------------------- 103s 127.0.0.1 - - [19/Jan/2025 20:23:17] "POST /post HTTP/1.1" 501 159 103s _______________________________ test_gzip[http] ________________________________ 103s 103s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-0/test_gzip_http_0') 103s httpbin_both = 103s verify_pool_mgr = 103s 103s def test_gzip(tmpdir, httpbin_both, verify_pool_mgr): 103s """ 103s Ensure that requests (actually urllib3) is able to automatically decompress 103s the response body 103s """ 103s url = httpbin_both.url + "/gzip" 103s response = verify_pool_mgr.request("GET", url) 103s 103s with vcr.use_cassette(str(tmpdir.join("gzip.yaml"))): 103s > response = verify_pool_mgr.request("GET", url) 103s 103s tests/integration/test_urllib3.py:140: 103s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 103s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 103s return self.request_encode_url( 103s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 103s return self.urlopen(method, url, **extra_kw) 103s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 103s response = conn.urlopen(method, u.request_uri, **kw) 103s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 103s response = self._make_request( 103s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 103s 103s self = 103s conn = 103s method = 'GET', url = '/gzip', body = None, headers = {} 103s retries = Retry(total=3, 
connect=None, read=None, redirect=None, status=None) 103s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 103s chunked = False, response_conn = None, preload_content = True 103s decode_content = True, enforce_content_length = True 103s 103s def _make_request( 103s self, 103s conn: BaseHTTPConnection, 103s method: str, 103s url: str, 103s body: _TYPE_BODY | None = None, 103s headers: typing.Mapping[str, str] | None = None, 103s retries: Retry | None = None, 103s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 103s chunked: bool = False, 103s response_conn: BaseHTTPConnection | None = None, 103s preload_content: bool = True, 103s decode_content: bool = True, 103s enforce_content_length: bool = True, 103s ) -> BaseHTTPResponse: 103s """ 103s Perform a request on a given urllib connection object taken from our 103s pool. 103s 103s :param conn: 103s a connection from one of our connection pools 103s 103s :param method: 103s HTTP request method (such as GET, POST, PUT, etc.) 103s 103s :param url: 103s The URL to perform the request on. 103s 103s :param body: 103s Data to send in the request body, either :class:`str`, :class:`bytes`, 103s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 103s 103s :param headers: 103s Dictionary of custom headers to send, such as User-Agent, 103s If-None-Match, etc. If None, pool headers are used. If provided, 103s these headers completely replace any pool-specific headers. 103s 103s :param retries: 103s Configure the number of retries to allow before raising a 103s :class:`~urllib3.exceptions.MaxRetryError` exception. 103s 103s Pass ``None`` to retry until you receive a response. Pass a 103s :class:`~urllib3.util.retry.Retry` object for fine-grained control 103s over different types of retries. 103s Pass an integer number to retry connection errors that many times, 103s but no other types of errors. Pass zero to never retry. 103s 103s If ``False``, then retries are disabled and any exception is raised 103s immediately. Also, instead of raising a MaxRetryError on redirects, 103s the redirect response will be returned. 103s 103s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 103s 103s :param timeout: 103s If specified, overrides the default timeout for this one 103s request. It may be a float (in seconds) or an instance of 103s :class:`urllib3.util.Timeout`. 103s 103s :param chunked: 103s If True, urllib3 will send the body using chunked transfer 103s encoding. Otherwise, urllib3 will send the body using the standard 103s content-length form. Defaults to False. 103s 103s :param response_conn: 103s Set this to ``None`` if you will handle releasing the connection or 103s set the connection to have the response release it. 103s 103s :param preload_content: 103s If True, the response's body will be preloaded during construction. 103s 103s :param decode_content: 103s If True, will attempt to decode the body based on the 103s 'content-encoding' header. 103s 103s :param enforce_content_length: 103s Enforce content length checking. Body returned by server must match 103s value of Content-Length header, if present. Otherwise, raise error. 103s """ 103s self.num_requests += 1 103s 103s timeout_obj = self._get_timeout(timeout) 103s timeout_obj.start_connect() 103s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 103s 103s try: 103s # Trigger any extra validation we need to do. 
103s try: 103s self._validate_conn(conn) 103s except (SocketTimeout, BaseSSLError) as e: 103s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 103s raise 103s 103s # _validate_conn() starts the connection to an HTTPS proxy 103s # so we need to wrap errors with 'ProxyError' here too. 103s except ( 103s OSError, 103s NewConnectionError, 103s TimeoutError, 103s BaseSSLError, 103s CertificateError, 103s SSLError, 103s ) as e: 103s new_e: Exception = e 103s if isinstance(e, (BaseSSLError, CertificateError)): 103s new_e = SSLError(e) 103s # If the connection didn't successfully connect to it's proxy 103s # then there 103s if isinstance( 103s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 103s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 103s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 103s raise new_e 103s 103s # conn.request() calls http.client.*.request, not the method in 103s # urllib3.request. It also calls makefile (recv) on the socket. 103s try: 103s conn.request( 103s method, 103s url, 103s body=body, 103s headers=headers, 103s chunked=chunked, 103s preload_content=preload_content, 103s decode_content=decode_content, 103s enforce_content_length=enforce_content_length, 103s ) 103s 103s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 103s # legitimately able to close the connection after sending a valid response. 103s # With this behaviour, the received response is still readable. 103s except BrokenPipeError: 103s pass 103s except OSError as e: 103s # MacOS/Linux 103s # EPROTOTYPE and ECONNRESET are needed on macOS 103s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 103s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 103s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 103s raise 103s 103s # Reset the timeout for the recv() on the socket 103s read_timeout = timeout_obj.read_timeout 103s 103s if not conn.is_closed: 103s # In Python 3 socket.py will catch EAGAIN and return None when you 103s # try and read into the file pointer created by http.client, which 103s # instead raises a BadStatusLine exception. Instead of catching 103s # the exception and assuming all BadStatusLine exceptions are read 103s # timeouts, check for a zero timeout before making the request. 103s if read_timeout == 0: 103s raise ReadTimeoutError( 103s self, url, f"Read timed out. (read timeout={read_timeout})" 103s ) 103s conn.timeout = read_timeout 103s 103s # Receive the response from the server 103s try: 103s response = conn.getresponse() 103s except (BaseSSLError, OSError) as e: 103s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 103s raise 103s 103s # Set properties that are used by the pooling layer. 
103s response.retries = retries 103s response._connection = response_conn # type: ignore[attr-defined] 103s response._pool = self # type: ignore[attr-defined] 103s 103s log.debug( 103s '%s://%s:%s "%s %s %s" %s %s', 103s self.scheme, 103s self.host, 103s self.port, 103s method, 103s url, 103s > response.version_string, 103s response.status, 103s response.length_remaining, 103s ) 103s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 103s 103s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 103s ----------------------------- Captured stderr call ----------------------------- 103s 127.0.0.1 - - [19/Jan/2025 20:23:17] "GET /gzip HTTP/1.1" 200 165 103s 127.0.0.1 - - [19/Jan/2025 20:23:17] "GET /gzip HTTP/1.1" 200 165 103s ___________________________ test_status_code[https] ____________________________ 103s 103s httpbin_both = 103s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-0/test_status_code_https_0') 103s verify_pool_mgr = 103s 103s def test_status_code(httpbin_both, tmpdir, verify_pool_mgr): 103s """Ensure that we can read the status code""" 103s url = httpbin_both.url 103s with vcr.use_cassette(str(tmpdir.join("atts.yaml"))): 103s > status_code = verify_pool_mgr.request("GET", url).status 103s 103s tests/integration/test_urllib3.py:34: 103s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 103s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 103s return self.request_encode_url( 103s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 103s return self.urlopen(method, url, **extra_kw) 103s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 103s response = conn.urlopen(method, u.request_uri, **kw) 103s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 103s response = self._make_request( 103s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 103s 103s self = 103s conn = 103s method = 'GET', url = '/', body = None, headers = {} 103s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 103s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 103s chunked = False, response_conn = None, preload_content = True 103s decode_content = True, enforce_content_length = True 103s 103s def _make_request( 103s self, 103s conn: BaseHTTPConnection, 103s method: str, 103s url: str, 103s body: _TYPE_BODY | None = None, 103s headers: typing.Mapping[str, str] | None = None, 103s retries: Retry | None = None, 103s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 103s chunked: bool = False, 103s response_conn: BaseHTTPConnection | None = None, 103s preload_content: bool = True, 103s decode_content: bool = True, 103s enforce_content_length: bool = True, 103s ) -> BaseHTTPResponse: 103s """ 103s Perform a request on a given urllib connection object taken from our 103s pool. 103s 103s :param conn: 103s a connection from one of our connection pools 103s 103s :param method: 103s HTTP request method (such as GET, POST, PUT, etc.) 103s 103s :param url: 103s The URL to perform the request on. 103s 103s :param body: 103s Data to send in the request body, either :class:`str`, :class:`bytes`, 103s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 103s 103s :param headers: 103s Dictionary of custom headers to send, such as User-Agent, 103s If-None-Match, etc. If None, pool headers are used. 
If provided, 103s these headers completely replace any pool-specific headers. 103s 103s :param retries: 103s Configure the number of retries to allow before raising a 103s :class:`~urllib3.exceptions.MaxRetryError` exception. 103s 103s Pass ``None`` to retry until you receive a response. Pass a 103s :class:`~urllib3.util.retry.Retry` object for fine-grained control 103s over different types of retries. 103s Pass an integer number to retry connection errors that many times, 103s but no other types of errors. Pass zero to never retry. 103s 103s If ``False``, then retries are disabled and any exception is raised 103s immediately. Also, instead of raising a MaxRetryError on redirects, 103s the redirect response will be returned. 103s 103s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 103s 103s :param timeout: 103s If specified, overrides the default timeout for this one 103s request. It may be a float (in seconds) or an instance of 103s :class:`urllib3.util.Timeout`. 103s 103s :param chunked: 103s If True, urllib3 will send the body using chunked transfer 103s encoding. Otherwise, urllib3 will send the body using the standard 103s content-length form. Defaults to False. 103s 103s :param response_conn: 103s Set this to ``None`` if you will handle releasing the connection or 103s set the connection to have the response release it. 103s 103s :param preload_content: 103s If True, the response's body will be preloaded during construction. 103s 103s :param decode_content: 103s If True, will attempt to decode the body based on the 103s 'content-encoding' header. 103s 103s :param enforce_content_length: 103s Enforce content length checking. Body returned by server must match 103s value of Content-Length header, if present. Otherwise, raise error. 103s """ 103s self.num_requests += 1 103s 103s timeout_obj = self._get_timeout(timeout) 103s timeout_obj.start_connect() 103s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 103s 103s try: 103s # Trigger any extra validation we need to do. 103s try: 103s self._validate_conn(conn) 103s except (SocketTimeout, BaseSSLError) as e: 103s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 103s raise 103s 103s # _validate_conn() starts the connection to an HTTPS proxy 103s # so we need to wrap errors with 'ProxyError' here too. 103s except ( 103s OSError, 103s NewConnectionError, 103s TimeoutError, 103s BaseSSLError, 103s CertificateError, 103s SSLError, 103s ) as e: 103s new_e: Exception = e 103s if isinstance(e, (BaseSSLError, CertificateError)): 103s new_e = SSLError(e) 103s # If the connection didn't successfully connect to it's proxy 103s # then there 103s if isinstance( 103s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 103s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 103s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 103s raise new_e 103s 103s # conn.request() calls http.client.*.request, not the method in 103s # urllib3.request. It also calls makefile (recv) on the socket. 103s try: 103s conn.request( 103s method, 103s url, 103s body=body, 103s headers=headers, 103s chunked=chunked, 103s preload_content=preload_content, 103s decode_content=decode_content, 103s enforce_content_length=enforce_content_length, 103s ) 103s 103s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 103s # legitimately able to close the connection after sending a valid response. 103s # With this behaviour, the received response is still readable. 
103s except BrokenPipeError: 103s pass 103s except OSError as e: 103s # MacOS/Linux 103s # EPROTOTYPE and ECONNRESET are needed on macOS 103s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 103s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 103s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 103s raise 103s 103s # Reset the timeout for the recv() on the socket 103s read_timeout = timeout_obj.read_timeout 103s 103s if not conn.is_closed: 103s # In Python 3 socket.py will catch EAGAIN and return None when you 103s # try and read into the file pointer created by http.client, which 103s # instead raises a BadStatusLine exception. Instead of catching 103s # the exception and assuming all BadStatusLine exceptions are read 103s # timeouts, check for a zero timeout before making the request. 103s if read_timeout == 0: 103s raise ReadTimeoutError( 103s self, url, f"Read timed out. (read timeout={read_timeout})" 103s ) 103s conn.timeout = read_timeout 103s 103s # Receive the response from the server 103s try: 103s response = conn.getresponse() 103s except (BaseSSLError, OSError) as e: 103s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 103s raise 103s 103s # Set properties that are used by the pooling layer. 103s response.retries = retries 103s response._connection = response_conn # type: ignore[attr-defined] 103s response._pool = self # type: ignore[attr-defined] 103s 103s log.debug( 103s '%s://%s:%s "%s %s %s" %s %s', 103s self.scheme, 103s self.host, 103s self.port, 103s method, 103s url, 103s > response.version_string, 103s response.status, 103s response.length_remaining, 103s ) 103s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 103s 103s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 103s ----------------------------- Captured stderr call ----------------------------- 103s 127.0.0.1 - - [19/Jan/2025 20:23:17] "GET / HTTP/1.1" 200 9358 103s _____________________________ test_headers[https] ______________________________ 103s 103s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-0/test_headers_https_0') 103s httpbin_both = 103s verify_pool_mgr = 103s 103s def test_headers(tmpdir, httpbin_both, verify_pool_mgr): 103s """Ensure that we can read the headers back""" 103s url = httpbin_both.url 103s with vcr.use_cassette(str(tmpdir.join("headers.yaml"))): 103s > headers = verify_pool_mgr.request("GET", url).headers 103s 103s tests/integration/test_urllib3.py:44: 103s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 103s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 103s return self.request_encode_url( 103s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 103s return self.urlopen(method, url, **extra_kw) 103s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 103s response = conn.urlopen(method, u.request_uri, **kw) 103s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 103s response = self._make_request( 103s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 103s 103s self = 103s conn = 103s method = 'GET', url = '/', body = None, headers = {} 103s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 103s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 103s chunked = False, response_conn = None, 
preload_content = True 103s decode_content = True, enforce_content_length = True 103s 103s def _make_request( 103s self, 103s conn: BaseHTTPConnection, 103s method: str, 103s url: str, 103s body: _TYPE_BODY | None = None, 103s headers: typing.Mapping[str, str] | None = None, 103s retries: Retry | None = None, 103s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 103s chunked: bool = False, 103s response_conn: BaseHTTPConnection | None = None, 103s preload_content: bool = True, 103s decode_content: bool = True, 103s enforce_content_length: bool = True, 103s ) -> BaseHTTPResponse: 103s """ 103s Perform a request on a given urllib connection object taken from our 103s pool. 103s 103s :param conn: 103s a connection from one of our connection pools 103s 103s :param method: 103s HTTP request method (such as GET, POST, PUT, etc.) 103s 103s :param url: 103s The URL to perform the request on. 103s 103s :param body: 103s Data to send in the request body, either :class:`str`, :class:`bytes`, 103s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 103s 103s :param headers: 103s Dictionary of custom headers to send, such as User-Agent, 103s If-None-Match, etc. If None, pool headers are used. If provided, 103s these headers completely replace any pool-specific headers. 103s 103s :param retries: 103s Configure the number of retries to allow before raising a 103s :class:`~urllib3.exceptions.MaxRetryError` exception. 103s 103s Pass ``None`` to retry until you receive a response. Pass a 103s :class:`~urllib3.util.retry.Retry` object for fine-grained control 103s over different types of retries. 103s Pass an integer number to retry connection errors that many times, 103s but no other types of errors. Pass zero to never retry. 103s 103s If ``False``, then retries are disabled and any exception is raised 103s immediately. Also, instead of raising a MaxRetryError on redirects, 103s the redirect response will be returned. 103s 103s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 103s 103s :param timeout: 103s If specified, overrides the default timeout for this one 103s request. It may be a float (in seconds) or an instance of 103s :class:`urllib3.util.Timeout`. 103s 103s :param chunked: 103s If True, urllib3 will send the body using chunked transfer 103s encoding. Otherwise, urllib3 will send the body using the standard 103s content-length form. Defaults to False. 103s 103s :param response_conn: 103s Set this to ``None`` if you will handle releasing the connection or 103s set the connection to have the response release it. 103s 103s :param preload_content: 103s If True, the response's body will be preloaded during construction. 103s 103s :param decode_content: 103s If True, will attempt to decode the body based on the 103s 'content-encoding' header. 103s 103s :param enforce_content_length: 103s Enforce content length checking. Body returned by server must match 103s value of Content-Length header, if present. Otherwise, raise error. 103s """ 103s self.num_requests += 1 103s 103s timeout_obj = self._get_timeout(timeout) 103s timeout_obj.start_connect() 103s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 103s 103s try: 103s # Trigger any extra validation we need to do. 
103s try: 103s self._validate_conn(conn) 103s except (SocketTimeout, BaseSSLError) as e: 103s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 103s raise 103s 103s # _validate_conn() starts the connection to an HTTPS proxy 103s # so we need to wrap errors with 'ProxyError' here too. 103s except ( 103s OSError, 103s NewConnectionError, 103s TimeoutError, 103s BaseSSLError, 103s CertificateError, 103s SSLError, 103s ) as e: 103s new_e: Exception = e 103s if isinstance(e, (BaseSSLError, CertificateError)): 103s new_e = SSLError(e) 103s # If the connection didn't successfully connect to it's proxy 103s # then there 103s if isinstance( 103s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 103s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 103s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 103s raise new_e 103s 103s # conn.request() calls http.client.*.request, not the method in 103s # urllib3.request. It also calls makefile (recv) on the socket. 103s try: 103s conn.request( 103s method, 103s url, 103s body=body, 103s headers=headers, 103s chunked=chunked, 103s preload_content=preload_content, 103s decode_content=decode_content, 103s enforce_content_length=enforce_content_length, 103s ) 103s 103s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 103s # legitimately able to close the connection after sending a valid response. 103s # With this behaviour, the received response is still readable. 103s except BrokenPipeError: 103s pass 103s except OSError as e: 103s # MacOS/Linux 103s # EPROTOTYPE and ECONNRESET are needed on macOS 103s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 103s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 103s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 103s raise 103s 103s # Reset the timeout for the recv() on the socket 103s read_timeout = timeout_obj.read_timeout 103s 103s if not conn.is_closed: 103s # In Python 3 socket.py will catch EAGAIN and return None when you 103s # try and read into the file pointer created by http.client, which 103s # instead raises a BadStatusLine exception. Instead of catching 103s # the exception and assuming all BadStatusLine exceptions are read 103s # timeouts, check for a zero timeout before making the request. 103s if read_timeout == 0: 103s raise ReadTimeoutError( 103s self, url, f"Read timed out. (read timeout={read_timeout})" 103s ) 103s conn.timeout = read_timeout 103s 103s # Receive the response from the server 103s try: 103s response = conn.getresponse() 103s except (BaseSSLError, OSError) as e: 103s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 103s raise 103s 103s # Set properties that are used by the pooling layer. 
103s response.retries = retries 103s response._connection = response_conn # type: ignore[attr-defined] 103s response._pool = self # type: ignore[attr-defined] 103s 103s log.debug( 103s '%s://%s:%s "%s %s %s" %s %s', 103s self.scheme, 103s self.host, 103s self.port, 103s method, 103s url, 103s > response.version_string, 103s response.status, 103s response.length_remaining, 103s ) 103s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 103s 103s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 103s ----------------------------- Captured stderr call ----------------------------- 103s 127.0.0.1 - - [19/Jan/2025 20:23:17] "GET / HTTP/1.1" 200 9358 103s _______________________________ test_body[https] _______________________________ 103s 103s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-0/test_body_https_0') 103s httpbin_both = 103s verify_pool_mgr = 103s 103s def test_body(tmpdir, httpbin_both, verify_pool_mgr): 103s """Ensure the responses are all identical enough""" 103s url = httpbin_both.url + "/bytes/1024" 103s with vcr.use_cassette(str(tmpdir.join("body.yaml"))): 103s > content = verify_pool_mgr.request("GET", url).data 103s 103s tests/integration/test_urllib3.py:55: 103s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 103s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 103s return self.request_encode_url( 103s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 103s return self.urlopen(method, url, **extra_kw) 103s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 103s response = conn.urlopen(method, u.request_uri, **kw) 103s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 103s response = self._make_request( 103s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 103s 103s self = 103s conn = 103s method = 'GET', url = '/bytes/1024', body = None, headers = {} 103s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 103s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 103s chunked = False, response_conn = None, preload_content = True 103s decode_content = True, enforce_content_length = True 103s 103s def _make_request( 103s self, 103s conn: BaseHTTPConnection, 103s method: str, 103s url: str, 103s body: _TYPE_BODY | None = None, 103s headers: typing.Mapping[str, str] | None = None, 103s retries: Retry | None = None, 103s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 103s chunked: bool = False, 103s response_conn: BaseHTTPConnection | None = None, 103s preload_content: bool = True, 103s decode_content: bool = True, 103s enforce_content_length: bool = True, 103s ) -> BaseHTTPResponse: 103s """ 103s Perform a request on a given urllib connection object taken from our 103s pool. 103s 103s :param conn: 103s a connection from one of our connection pools 103s 103s :param method: 103s HTTP request method (such as GET, POST, PUT, etc.) 103s 103s :param url: 103s The URL to perform the request on. 103s 103s :param body: 103s Data to send in the request body, either :class:`str`, :class:`bytes`, 103s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 103s 103s :param headers: 103s Dictionary of custom headers to send, such as User-Agent, 103s If-None-Match, etc. If None, pool headers are used. If provided, 103s these headers completely replace any pool-specific headers. 
103s 103s :param retries: 103s Configure the number of retries to allow before raising a 103s :class:`~urllib3.exceptions.MaxRetryError` exception. 103s 103s Pass ``None`` to retry until you receive a response. Pass a 103s :class:`~urllib3.util.retry.Retry` object for fine-grained control 103s over different types of retries. 103s Pass an integer number to retry connection errors that many times, 103s but no other types of errors. Pass zero to never retry. 103s 103s If ``False``, then retries are disabled and any exception is raised 103s immediately. Also, instead of raising a MaxRetryError on redirects, 103s the redirect response will be returned. 103s 103s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 103s 103s :param timeout: 103s If specified, overrides the default timeout for this one 103s request. It may be a float (in seconds) or an instance of 103s :class:`urllib3.util.Timeout`. 103s 103s :param chunked: 103s If True, urllib3 will send the body using chunked transfer 103s encoding. Otherwise, urllib3 will send the body using the standard 103s content-length form. Defaults to False. 103s 103s :param response_conn: 103s Set this to ``None`` if you will handle releasing the connection or 103s set the connection to have the response release it. 103s 103s :param preload_content: 103s If True, the response's body will be preloaded during construction. 103s 103s :param decode_content: 103s If True, will attempt to decode the body based on the 103s 'content-encoding' header. 103s 103s :param enforce_content_length: 103s Enforce content length checking. Body returned by server must match 103s value of Content-Length header, if present. Otherwise, raise error. 103s """ 103s self.num_requests += 1 103s 103s timeout_obj = self._get_timeout(timeout) 103s timeout_obj.start_connect() 103s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 103s 103s try: 103s # Trigger any extra validation we need to do. 103s try: 103s self._validate_conn(conn) 103s except (SocketTimeout, BaseSSLError) as e: 103s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 103s raise 103s 103s # _validate_conn() starts the connection to an HTTPS proxy 103s # so we need to wrap errors with 'ProxyError' here too. 103s except ( 103s OSError, 103s NewConnectionError, 103s TimeoutError, 103s BaseSSLError, 103s CertificateError, 103s SSLError, 103s ) as e: 103s new_e: Exception = e 103s if isinstance(e, (BaseSSLError, CertificateError)): 103s new_e = SSLError(e) 103s # If the connection didn't successfully connect to it's proxy 103s # then there 103s if isinstance( 103s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 103s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 103s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 103s raise new_e 103s 103s # conn.request() calls http.client.*.request, not the method in 103s # urllib3.request. It also calls makefile (recv) on the socket. 103s try: 103s conn.request( 103s method, 103s url, 103s body=body, 103s headers=headers, 103s chunked=chunked, 103s preload_content=preload_content, 103s decode_content=decode_content, 103s enforce_content_length=enforce_content_length, 103s ) 103s 103s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 103s # legitimately able to close the connection after sending a valid response. 103s # With this behaviour, the received response is still readable. 
103s except BrokenPipeError: 103s pass 103s except OSError as e: 103s # MacOS/Linux 103s # EPROTOTYPE and ECONNRESET are needed on macOS 103s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 103s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 103s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 103s raise 103s 103s # Reset the timeout for the recv() on the socket 103s read_timeout = timeout_obj.read_timeout 103s 103s if not conn.is_closed: 103s # In Python 3 socket.py will catch EAGAIN and return None when you 103s # try and read into the file pointer created by http.client, which 103s # instead raises a BadStatusLine exception. Instead of catching 103s # the exception and assuming all BadStatusLine exceptions are read 103s # timeouts, check for a zero timeout before making the request. 103s if read_timeout == 0: 103s raise ReadTimeoutError( 103s self, url, f"Read timed out. (read timeout={read_timeout})" 103s ) 103s conn.timeout = read_timeout 103s 103s # Receive the response from the server 103s try: 103s response = conn.getresponse() 103s except (BaseSSLError, OSError) as e: 103s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 103s raise 103s 103s # Set properties that are used by the pooling layer. 103s response.retries = retries 103s response._connection = response_conn # type: ignore[attr-defined] 103s response._pool = self # type: ignore[attr-defined] 103s 103s log.debug( 103s '%s://%s:%s "%s %s %s" %s %s', 103s self.scheme, 103s self.host, 103s self.port, 103s method, 103s url, 103s > response.version_string, 103s response.status, 103s response.length_remaining, 103s ) 103s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 103s 103s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 103s ----------------------------- Captured stderr call ----------------------------- 103s 127.0.0.1 - - [19/Jan/2025 20:23:17] "GET /bytes/1024 HTTP/1.1" 200 1024 103s _______________________________ test_auth[https] _______________________________ 103s 103s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-0/test_auth_https_0') 103s httpbin_both = 103s verify_pool_mgr = 103s 103s def test_auth(tmpdir, httpbin_both, verify_pool_mgr): 103s """Ensure that we can handle basic auth""" 103s auth = ("user", "passwd") 103s headers = urllib3.util.make_headers(basic_auth="{}:{}".format(*auth)) 103s url = httpbin_both.url + "/basic-auth/user/passwd" 103s with vcr.use_cassette(str(tmpdir.join("auth.yaml"))): 103s > one = verify_pool_mgr.request("GET", url, headers=headers) 103s 103s tests/integration/test_urllib3.py:67: 103s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 103s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 103s return self.request_encode_url( 103s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 103s return self.urlopen(method, url, **extra_kw) 103s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 103s response = conn.urlopen(method, u.request_uri, **kw) 103s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 103s response = self._make_request( 103s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 103s 103s self = 103s conn = 103s method = 'GET', url = '/basic-auth/user/passwd', body = None 103s headers = {'authorization': 'Basic dXNlcjpwYXNzd2Q='} 103s retries = Retry(total=3, 
connect=None, read=None, redirect=None, status=None) 103s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 103s chunked = False, response_conn = None, preload_content = True 103s decode_content = True, enforce_content_length = True 103s 103s def _make_request( 103s self, 103s conn: BaseHTTPConnection, 103s method: str, 103s url: str, 103s body: _TYPE_BODY | None = None, 103s headers: typing.Mapping[str, str] | None = None, 103s retries: Retry | None = None, 103s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 103s chunked: bool = False, 103s response_conn: BaseHTTPConnection | None = None, 103s preload_content: bool = True, 103s decode_content: bool = True, 103s enforce_content_length: bool = True, 103s ) -> BaseHTTPResponse: 103s """ 103s Perform a request on a given urllib connection object taken from our 103s pool. 103s 103s :param conn: 103s a connection from one of our connection pools 103s 103s :param method: 103s HTTP request method (such as GET, POST, PUT, etc.) 103s 103s :param url: 103s The URL to perform the request on. 103s 103s :param body: 103s Data to send in the request body, either :class:`str`, :class:`bytes`, 103s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 103s 103s :param headers: 103s Dictionary of custom headers to send, such as User-Agent, 103s If-None-Match, etc. If None, pool headers are used. If provided, 103s these headers completely replace any pool-specific headers. 103s 103s :param retries: 103s Configure the number of retries to allow before raising a 103s :class:`~urllib3.exceptions.MaxRetryError` exception. 103s 103s Pass ``None`` to retry until you receive a response. Pass a 103s :class:`~urllib3.util.retry.Retry` object for fine-grained control 103s over different types of retries. 103s Pass an integer number to retry connection errors that many times, 103s but no other types of errors. Pass zero to never retry. 103s 103s If ``False``, then retries are disabled and any exception is raised 103s immediately. Also, instead of raising a MaxRetryError on redirects, 103s the redirect response will be returned. 103s 103s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 103s 103s :param timeout: 103s If specified, overrides the default timeout for this one 103s request. It may be a float (in seconds) or an instance of 103s :class:`urllib3.util.Timeout`. 103s 103s :param chunked: 103s If True, urllib3 will send the body using chunked transfer 103s encoding. Otherwise, urllib3 will send the body using the standard 103s content-length form. Defaults to False. 103s 103s :param response_conn: 103s Set this to ``None`` if you will handle releasing the connection or 103s set the connection to have the response release it. 103s 103s :param preload_content: 103s If True, the response's body will be preloaded during construction. 103s 103s :param decode_content: 103s If True, will attempt to decode the body based on the 103s 'content-encoding' header. 103s 103s :param enforce_content_length: 103s Enforce content length checking. Body returned by server must match 103s value of Content-Length header, if present. Otherwise, raise error. 103s """ 103s self.num_requests += 1 103s 103s timeout_obj = self._get_timeout(timeout) 103s timeout_obj.start_connect() 103s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 103s 103s try: 103s # Trigger any extra validation we need to do. 
103s try: 103s self._validate_conn(conn) 103s except (SocketTimeout, BaseSSLError) as e: 103s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 103s raise 103s 103s # _validate_conn() starts the connection to an HTTPS proxy 103s # so we need to wrap errors with 'ProxyError' here too. 103s except ( 103s OSError, 103s NewConnectionError, 103s TimeoutError, 103s BaseSSLError, 103s CertificateError, 103s SSLError, 103s ) as e: 103s new_e: Exception = e 103s if isinstance(e, (BaseSSLError, CertificateError)): 103s new_e = SSLError(e) 103s # If the connection didn't successfully connect to it's proxy 103s # then there 103s if isinstance( 103s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 103s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 103s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 103s raise new_e 103s 103s # conn.request() calls http.client.*.request, not the method in 103s # urllib3.request. It also calls makefile (recv) on the socket. 103s try: 103s conn.request( 103s method, 103s url, 103s body=body, 103s headers=headers, 103s chunked=chunked, 103s preload_content=preload_content, 103s decode_content=decode_content, 103s enforce_content_length=enforce_content_length, 103s ) 103s 103s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 103s # legitimately able to close the connection after sending a valid response. 103s # With this behaviour, the received response is still readable. 103s except BrokenPipeError: 103s pass 103s except OSError as e: 103s # MacOS/Linux 103s # EPROTOTYPE and ECONNRESET are needed on macOS 103s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 103s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 103s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 103s raise 103s 103s # Reset the timeout for the recv() on the socket 103s read_timeout = timeout_obj.read_timeout 103s 103s if not conn.is_closed: 103s # In Python 3 socket.py will catch EAGAIN and return None when you 103s # try and read into the file pointer created by http.client, which 103s # instead raises a BadStatusLine exception. Instead of catching 103s # the exception and assuming all BadStatusLine exceptions are read 103s # timeouts, check for a zero timeout before making the request. 103s if read_timeout == 0: 103s raise ReadTimeoutError( 103s self, url, f"Read timed out. (read timeout={read_timeout})" 103s ) 103s conn.timeout = read_timeout 103s 103s # Receive the response from the server 103s try: 103s response = conn.getresponse() 103s except (BaseSSLError, OSError) as e: 103s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 103s raise 103s 103s # Set properties that are used by the pooling layer. 
103s response.retries = retries 103s response._connection = response_conn # type: ignore[attr-defined] 103s response._pool = self # type: ignore[attr-defined] 103s 103s log.debug( 103s '%s://%s:%s "%s %s %s" %s %s', 103s self.scheme, 103s self.host, 103s self.port, 103s method, 103s url, 103s > response.version_string, 103s response.status, 103s response.length_remaining, 103s ) 103s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 103s 103s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 103s ----------------------------- Captured stderr call ----------------------------- 103s 127.0.0.1 - - [19/Jan/2025 20:23:17] "GET /basic-auth/user/passwd HTTP/1.1" 200 46 103s ___________________________ test_auth_failed[https] ____________________________ 103s 103s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-0/test_auth_failed_https_0') 103s httpbin_both = 103s verify_pool_mgr = 103s 103s def test_auth_failed(tmpdir, httpbin_both, verify_pool_mgr): 103s """Ensure that we can save failed auth statuses""" 103s auth = ("user", "wrongwrongwrong") 103s headers = urllib3.util.make_headers(basic_auth="{}:{}".format(*auth)) 103s url = httpbin_both.url + "/basic-auth/user/passwd" 103s with vcr.use_cassette(str(tmpdir.join("auth-failed.yaml"))) as cass: 103s # Ensure that this is empty to begin with 103s assert_cassette_empty(cass) 103s > one = verify_pool_mgr.request("GET", url, headers=headers) 103s 103s tests/integration/test_urllib3.py:83: 103s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 103s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 103s return self.request_encode_url( 103s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 103s return self.urlopen(method, url, **extra_kw) 103s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 103s response = conn.urlopen(method, u.request_uri, **kw) 103s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 103s response = self._make_request( 103s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 103s 103s self = 103s conn = 103s method = 'GET', url = '/basic-auth/user/passwd', body = None 103s headers = {'authorization': 'Basic dXNlcjp3cm9uZ3dyb25nd3Jvbmc='} 103s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 103s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 103s chunked = False, response_conn = None, preload_content = True 103s decode_content = True, enforce_content_length = True 103s 103s def _make_request( 103s self, 103s conn: BaseHTTPConnection, 103s method: str, 103s url: str, 103s body: _TYPE_BODY | None = None, 103s headers: typing.Mapping[str, str] | None = None, 103s retries: Retry | None = None, 103s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 103s chunked: bool = False, 103s response_conn: BaseHTTPConnection | None = None, 103s preload_content: bool = True, 103s decode_content: bool = True, 103s enforce_content_length: bool = True, 103s ) -> BaseHTTPResponse: 103s """ 103s Perform a request on a given urllib connection object taken from our 103s pool. 103s 103s :param conn: 103s a connection from one of our connection pools 103s 103s :param method: 103s HTTP request method (such as GET, POST, PUT, etc.) 103s 103s :param url: 103s The URL to perform the request on. 
103s 103s :param body: 103s Data to send in the request body, either :class:`str`, :class:`bytes`, 103s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 103s 103s :param headers: 103s Dictionary of custom headers to send, such as User-Agent, 103s If-None-Match, etc. If None, pool headers are used. If provided, 103s these headers completely replace any pool-specific headers. 103s 103s :param retries: 103s Configure the number of retries to allow before raising a 103s :class:`~urllib3.exceptions.MaxRetryError` exception. 103s 103s Pass ``None`` to retry until you receive a response. Pass a 103s :class:`~urllib3.util.retry.Retry` object for fine-grained control 103s over different types of retries. 103s Pass an integer number to retry connection errors that many times, 103s but no other types of errors. Pass zero to never retry. 103s 103s If ``False``, then retries are disabled and any exception is raised 103s immediately. Also, instead of raising a MaxRetryError on redirects, 103s the redirect response will be returned. 103s 103s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 103s 103s :param timeout: 103s If specified, overrides the default timeout for this one 103s request. It may be a float (in seconds) or an instance of 103s :class:`urllib3.util.Timeout`. 103s 103s :param chunked: 103s If True, urllib3 will send the body using chunked transfer 103s encoding. Otherwise, urllib3 will send the body using the standard 103s content-length form. Defaults to False. 103s 103s :param response_conn: 103s Set this to ``None`` if you will handle releasing the connection or 103s set the connection to have the response release it. 103s 103s :param preload_content: 103s If True, the response's body will be preloaded during construction. 103s 103s :param decode_content: 103s If True, will attempt to decode the body based on the 103s 'content-encoding' header. 103s 103s :param enforce_content_length: 103s Enforce content length checking. Body returned by server must match 103s value of Content-Length header, if present. Otherwise, raise error. 103s """ 103s self.num_requests += 1 103s 103s timeout_obj = self._get_timeout(timeout) 103s timeout_obj.start_connect() 103s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 103s 103s try: 103s # Trigger any extra validation we need to do. 103s try: 103s self._validate_conn(conn) 103s except (SocketTimeout, BaseSSLError) as e: 103s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 103s raise 103s 103s # _validate_conn() starts the connection to an HTTPS proxy 103s # so we need to wrap errors with 'ProxyError' here too. 103s except ( 103s OSError, 103s NewConnectionError, 103s TimeoutError, 103s BaseSSLError, 103s CertificateError, 103s SSLError, 103s ) as e: 103s new_e: Exception = e 103s if isinstance(e, (BaseSSLError, CertificateError)): 103s new_e = SSLError(e) 103s # If the connection didn't successfully connect to it's proxy 103s # then there 103s if isinstance( 103s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 103s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 103s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 103s raise new_e 103s 103s # conn.request() calls http.client.*.request, not the method in 103s # urllib3.request. It also calls makefile (recv) on the socket. 
103s try: 103s conn.request( 103s method, 103s url, 103s body=body, 103s headers=headers, 103s chunked=chunked, 103s preload_content=preload_content, 103s decode_content=decode_content, 103s enforce_content_length=enforce_content_length, 103s ) 103s 103s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 103s # legitimately able to close the connection after sending a valid response. 103s # With this behaviour, the received response is still readable. 103s except BrokenPipeError: 103s pass 103s except OSError as e: 103s # MacOS/Linux 103s # EPROTOTYPE and ECONNRESET are needed on macOS 103s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 103s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 103s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 103s raise 103s 103s # Reset the timeout for the recv() on the socket 103s read_timeout = timeout_obj.read_timeout 103s 103s if not conn.is_closed: 103s # In Python 3 socket.py will catch EAGAIN and return None when you 103s # try and read into the file pointer created by http.client, which 103s # instead raises a BadStatusLine exception. Instead of catching 103s # the exception and assuming all BadStatusLine exceptions are read 103s # timeouts, check for a zero timeout before making the request. 103s if read_timeout == 0: 103s raise ReadTimeoutError( 103s self, url, f"Read timed out. (read timeout={read_timeout})" 103s ) 103s conn.timeout = read_timeout 103s 103s # Receive the response from the server 103s try: 103s response = conn.getresponse() 103s except (BaseSSLError, OSError) as e: 103s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 103s raise 103s 103s # Set properties that are used by the pooling layer. 
103s response.retries = retries 103s response._connection = response_conn # type: ignore[attr-defined] 103s response._pool = self # type: ignore[attr-defined] 103s 103s log.debug( 103s '%s://%s:%s "%s %s %s" %s %s', 103s self.scheme, 103s self.host, 103s self.port, 103s method, 103s url, 103s > response.version_string, 103s response.status, 103s response.length_remaining, 103s ) 103s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 103s 103s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 103s ----------------------------- Captured stderr call ----------------------------- 103s 127.0.0.1 - - [19/Jan/2025 20:23:17] "GET /basic-auth/user/passwd HTTP/1.1" 401 0 103s _______________________________ test_post[https] _______________________________ 103s 103s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-0/test_post_https_0') 103s httpbin_both = 103s verify_pool_mgr = 103s 103s def test_post(tmpdir, httpbin_both, verify_pool_mgr): 103s """Ensure that we can post and cache the results""" 103s data = {"key1": "value1", "key2": "value2"} 103s url = httpbin_both.url + "/post" 103s with vcr.use_cassette(str(tmpdir.join("verify_pool_mgr.yaml"))): 103s > req1 = verify_pool_mgr.request("POST", url, data).data 103s 103s tests/integration/test_urllib3.py:94: 103s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 103s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:143: in request 103s return self.request_encode_body( 103s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:278: in request_encode_body 103s return self.urlopen(method, url, **extra_kw) 103s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 103s response = conn.urlopen(method, u.request_uri, **kw) 103s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 103s response = self._make_request( 103s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 103s 103s self = 103s conn = 103s method = 'POST', url = '/post', body = {'key1': 'value1', 'key2': 'value2'} 103s headers = HTTPHeaderDict({}) 103s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 103s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 103s chunked = False, response_conn = None, preload_content = True 103s decode_content = True, enforce_content_length = True 103s 103s def _make_request( 103s self, 103s conn: BaseHTTPConnection, 103s method: str, 103s url: str, 103s body: _TYPE_BODY | None = None, 103s headers: typing.Mapping[str, str] | None = None, 103s retries: Retry | None = None, 103s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 103s chunked: bool = False, 103s response_conn: BaseHTTPConnection | None = None, 103s preload_content: bool = True, 103s decode_content: bool = True, 103s enforce_content_length: bool = True, 103s ) -> BaseHTTPResponse: 103s """ 103s Perform a request on a given urllib connection object taken from our 103s pool. 103s 103s :param conn: 103s a connection from one of our connection pools 103s 103s :param method: 103s HTTP request method (such as GET, POST, PUT, etc.) 103s 103s :param url: 103s The URL to perform the request on. 103s 103s :param body: 103s Data to send in the request body, either :class:`str`, :class:`bytes`, 103s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 103s 103s :param headers: 103s Dictionary of custom headers to send, such as User-Agent, 103s If-None-Match, etc. 
If None, pool headers are used. If provided, 103s these headers completely replace any pool-specific headers. 103s 103s :param retries: 103s Configure the number of retries to allow before raising a 103s :class:`~urllib3.exceptions.MaxRetryError` exception. 103s 103s Pass ``None`` to retry until you receive a response. Pass a 103s :class:`~urllib3.util.retry.Retry` object for fine-grained control 103s over different types of retries. 103s Pass an integer number to retry connection errors that many times, 103s but no other types of errors. Pass zero to never retry. 103s 103s If ``False``, then retries are disabled and any exception is raised 103s immediately. Also, instead of raising a MaxRetryError on redirects, 103s the redirect response will be returned. 103s 103s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 103s 103s :param timeout: 103s If specified, overrides the default timeout for this one 103s request. It may be a float (in seconds) or an instance of 103s :class:`urllib3.util.Timeout`. 103s 103s :param chunked: 103s If True, urllib3 will send the body using chunked transfer 103s encoding. Otherwise, urllib3 will send the body using the standard 103s content-length form. Defaults to False. 103s 103s :param response_conn: 103s Set this to ``None`` if you will handle releasing the connection or 103s set the connection to have the response release it. 103s 103s :param preload_content: 103s If True, the response's body will be preloaded during construction. 103s 103s :param decode_content: 103s If True, will attempt to decode the body based on the 103s 'content-encoding' header. 103s 103s :param enforce_content_length: 103s Enforce content length checking. Body returned by server must match 103s value of Content-Length header, if present. Otherwise, raise error. 103s """ 103s self.num_requests += 1 103s 103s timeout_obj = self._get_timeout(timeout) 103s timeout_obj.start_connect() 103s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 103s 103s try: 103s # Trigger any extra validation we need to do. 103s try: 103s self._validate_conn(conn) 103s except (SocketTimeout, BaseSSLError) as e: 103s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 103s raise 103s 103s # _validate_conn() starts the connection to an HTTPS proxy 103s # so we need to wrap errors with 'ProxyError' here too. 103s except ( 103s OSError, 103s NewConnectionError, 103s TimeoutError, 103s BaseSSLError, 103s CertificateError, 103s SSLError, 103s ) as e: 103s new_e: Exception = e 103s if isinstance(e, (BaseSSLError, CertificateError)): 103s new_e = SSLError(e) 103s # If the connection didn't successfully connect to it's proxy 103s # then there 103s if isinstance( 103s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 103s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 103s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 103s raise new_e 103s 103s # conn.request() calls http.client.*.request, not the method in 103s # urllib3.request. It also calls makefile (recv) on the socket. 103s try: 103s conn.request( 103s method, 103s url, 103s body=body, 103s headers=headers, 103s chunked=chunked, 103s preload_content=preload_content, 103s decode_content=decode_content, 103s enforce_content_length=enforce_content_length, 103s ) 103s 103s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 103s # legitimately able to close the connection after sending a valid response. 
103s # With this behaviour, the received response is still readable. 103s except BrokenPipeError: 103s pass 103s except OSError as e: 103s # MacOS/Linux 103s # EPROTOTYPE and ECONNRESET are needed on macOS 103s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 103s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 103s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 103s raise 103s 103s # Reset the timeout for the recv() on the socket 103s read_timeout = timeout_obj.read_timeout 103s 103s if not conn.is_closed: 103s # In Python 3 socket.py will catch EAGAIN and return None when you 103s # try and read into the file pointer created by http.client, which 103s # instead raises a BadStatusLine exception. Instead of catching 103s # the exception and assuming all BadStatusLine exceptions are read 103s # timeouts, check for a zero timeout before making the request. 103s if read_timeout == 0: 103s raise ReadTimeoutError( 103s self, url, f"Read timed out. (read timeout={read_timeout})" 103s ) 103s conn.timeout = read_timeout 103s 103s # Receive the response from the server 103s try: 103s response = conn.getresponse() 103s except (BaseSSLError, OSError) as e: 103s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 103s raise 103s 103s # Set properties that are used by the pooling layer. 103s response.retries = retries 103s response._connection = response_conn # type: ignore[attr-defined] 103s response._pool = self # type: ignore[attr-defined] 103s 103s log.debug( 103s '%s://%s:%s "%s %s %s" %s %s', 103s self.scheme, 103s self.host, 103s self.port, 103s method, 103s url, 103s > response.version_string, 103s response.status, 103s response.length_remaining, 103s ) 103s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 103s 103s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 103s ----------------------------- Captured stderr call ----------------------------- 103s 127.0.0.1 - - [19/Jan/2025 20:23:17] "POST /post HTTP/1.1" 501 159 103s _______________________________ test_gzip[https] _______________________________ 103s 103s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-0/test_gzip_https_0') 103s httpbin_both = 103s verify_pool_mgr = 103s 103s def test_gzip(tmpdir, httpbin_both, verify_pool_mgr): 103s """ 103s Ensure that requests (actually urllib3) is able to automatically decompress 103s the response body 103s """ 103s url = httpbin_both.url + "/gzip" 103s response = verify_pool_mgr.request("GET", url) 103s 103s with vcr.use_cassette(str(tmpdir.join("gzip.yaml"))): 103s > response = verify_pool_mgr.request("GET", url) 103s 103s tests/integration/test_urllib3.py:140: 103s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 103s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 103s return self.request_encode_url( 103s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 103s return self.urlopen(method, url, **extra_kw) 103s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 103s response = conn.urlopen(method, u.request_uri, **kw) 103s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 103s response = self._make_request( 103s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 103s 103s self = 103s conn = 103s method = 'GET', url = '/gzip', body = None, headers = {} 103s retries = Retry(total=3, 
connect=None, read=None, redirect=None, status=None) 103s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 103s chunked = False, response_conn = None, preload_content = True 103s decode_content = True, enforce_content_length = True 103s 103s def _make_request( 103s self, 103s conn: BaseHTTPConnection, 103s method: str, 103s url: str, 103s body: _TYPE_BODY | None = None, 103s headers: typing.Mapping[str, str] | None = None, 103s retries: Retry | None = None, 103s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 103s chunked: bool = False, 103s response_conn: BaseHTTPConnection | None = None, 103s preload_content: bool = True, 103s decode_content: bool = True, 103s enforce_content_length: bool = True, 103s ) -> BaseHTTPResponse: 103s """ 103s Perform a request on a given urllib connection object taken from our 103s pool. 103s 103s :param conn: 103s a connection from one of our connection pools 103s 103s :param method: 103s HTTP request method (such as GET, POST, PUT, etc.) 103s 103s :param url: 103s The URL to perform the request on. 103s 103s :param body: 103s Data to send in the request body, either :class:`str`, :class:`bytes`, 103s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 103s 103s :param headers: 103s Dictionary of custom headers to send, such as User-Agent, 103s If-None-Match, etc. If None, pool headers are used. If provided, 103s these headers completely replace any pool-specific headers. 103s 103s :param retries: 103s Configure the number of retries to allow before raising a 103s :class:`~urllib3.exceptions.MaxRetryError` exception. 103s 103s Pass ``None`` to retry until you receive a response. Pass a 103s :class:`~urllib3.util.retry.Retry` object for fine-grained control 103s over different types of retries. 103s Pass an integer number to retry connection errors that many times, 103s but no other types of errors. Pass zero to never retry. 103s 103s If ``False``, then retries are disabled and any exception is raised 103s immediately. Also, instead of raising a MaxRetryError on redirects, 103s the redirect response will be returned. 103s 103s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 103s 103s :param timeout: 103s If specified, overrides the default timeout for this one 103s request. It may be a float (in seconds) or an instance of 103s :class:`urllib3.util.Timeout`. 103s 103s :param chunked: 103s If True, urllib3 will send the body using chunked transfer 103s encoding. Otherwise, urllib3 will send the body using the standard 103s content-length form. Defaults to False. 103s 103s :param response_conn: 103s Set this to ``None`` if you will handle releasing the connection or 103s set the connection to have the response release it. 103s 103s :param preload_content: 103s If True, the response's body will be preloaded during construction. 103s 103s :param decode_content: 103s If True, will attempt to decode the body based on the 103s 'content-encoding' header. 103s 103s :param enforce_content_length: 103s Enforce content length checking. Body returned by server must match 103s value of Content-Length header, if present. Otherwise, raise error. 103s """ 103s self.num_requests += 1 103s 103s timeout_obj = self._get_timeout(timeout) 103s timeout_obj.start_connect() 103s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 103s 103s try: 103s # Trigger any extra validation we need to do. 
103s try: 103s self._validate_conn(conn) 103s except (SocketTimeout, BaseSSLError) as e: 103s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 103s raise 103s 103s # _validate_conn() starts the connection to an HTTPS proxy 103s # so we need to wrap errors with 'ProxyError' here too. 103s except ( 103s OSError, 103s NewConnectionError, 103s TimeoutError, 103s BaseSSLError, 103s CertificateError, 103s SSLError, 103s ) as e: 103s new_e: Exception = e 103s if isinstance(e, (BaseSSLError, CertificateError)): 103s new_e = SSLError(e) 103s # If the connection didn't successfully connect to it's proxy 103s # then there 103s if isinstance( 103s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 103s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 103s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 103s raise new_e 103s 103s # conn.request() calls http.client.*.request, not the method in 103s # urllib3.request. It also calls makefile (recv) on the socket. 103s try: 103s conn.request( 103s method, 103s url, 103s body=body, 103s headers=headers, 103s chunked=chunked, 103s preload_content=preload_content, 103s decode_content=decode_content, 103s enforce_content_length=enforce_content_length, 103s ) 103s 103s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 103s # legitimately able to close the connection after sending a valid response. 103s # With this behaviour, the received response is still readable. 103s except BrokenPipeError: 103s pass 103s except OSError as e: 103s # MacOS/Linux 103s # EPROTOTYPE and ECONNRESET are needed on macOS 103s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 103s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 103s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 103s raise 103s 103s # Reset the timeout for the recv() on the socket 103s read_timeout = timeout_obj.read_timeout 103s 103s if not conn.is_closed: 103s # In Python 3 socket.py will catch EAGAIN and return None when you 103s # try and read into the file pointer created by http.client, which 103s # instead raises a BadStatusLine exception. Instead of catching 103s # the exception and assuming all BadStatusLine exceptions are read 103s # timeouts, check for a zero timeout before making the request. 103s if read_timeout == 0: 103s raise ReadTimeoutError( 103s self, url, f"Read timed out. (read timeout={read_timeout})" 103s ) 103s conn.timeout = read_timeout 103s 103s # Receive the response from the server 103s try: 103s response = conn.getresponse() 103s except (BaseSSLError, OSError) as e: 103s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 103s raise 103s 103s # Set properties that are used by the pooling layer. 
103s response.retries = retries 103s response._connection = response_conn # type: ignore[attr-defined] 103s response._pool = self # type: ignore[attr-defined] 103s 103s log.debug( 103s '%s://%s:%s "%s %s %s" %s %s', 103s self.scheme, 103s self.host, 103s self.port, 103s method, 103s url, 103s > response.version_string, 103s response.status, 103s response.length_remaining, 103s ) 103s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 103s 103s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 103s ----------------------------- Captured stderr call ----------------------------- 103s 127.0.0.1 - - [19/Jan/2025 20:23:18] "GET /gzip HTTP/1.1" 200 165 103s 127.0.0.1 - - [19/Jan/2025 20:23:18] "GET /gzip HTTP/1.1" 200 165 103s ________________________________ test_use_proxy ________________________________ 103s 103s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-0/test_use_proxy0') 103s httpbin = 103s proxy_server = 'http://0.0.0.0:45365' 103s 103s def test_use_proxy(tmpdir, httpbin, proxy_server): 103s """Ensure that it works with a proxy.""" 103s with vcr.use_cassette(str(tmpdir.join("proxy.yaml"))): 103s > response = requests.get(httpbin.url, proxies={"http": proxy_server}) 103s 103s tests/integration/test_proxy.py:53: 103s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 103s /usr/lib/python3/dist-packages/requests/api.py:73: in get 103s return request("get", url, params=params, **kwargs) 103s /usr/lib/python3/dist-packages/requests/api.py:59: in request 103s return session.request(method=method, url=url, **kwargs) 103s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 103s resp = self.send(prep, **send_kwargs) 103s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 103s r = adapter.send(request, **kwargs) 103s /usr/lib/python3/dist-packages/requests/adapters.py:667: in send 103s resp = conn.urlopen( 103s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 103s response = self._make_request( 103s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 103s 103s self = 103s conn = 103s method = 'GET', url = 'http://127.0.0.1:41093/', body = None 103s headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate, br', 'Accept': '*/*', 'Connection': 'keep-alive'} 103s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 103s timeout = Timeout(connect=None, read=None, total=None), chunked = False 103s response_conn = 103s preload_content = False, decode_content = False, enforce_content_length = True 103s 103s def _make_request( 103s self, 103s conn: BaseHTTPConnection, 103s method: str, 103s url: str, 103s body: _TYPE_BODY | None = None, 103s headers: typing.Mapping[str, str] | None = None, 103s retries: Retry | None = None, 103s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 103s chunked: bool = False, 103s response_conn: BaseHTTPConnection | None = None, 103s preload_content: bool = True, 103s decode_content: bool = True, 103s enforce_content_length: bool = True, 103s ) -> BaseHTTPResponse: 103s """ 103s Perform a request on a given urllib connection object taken from our 103s pool. 103s 103s :param conn: 103s a connection from one of our connection pools 103s 103s :param method: 103s HTTP request method (such as GET, POST, PUT, etc.) 103s 103s :param url: 103s The URL to perform the request on. 
103s 103s :param body: 103s Data to send in the request body, either :class:`str`, :class:`bytes`, 103s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 103s 103s :param headers: 103s Dictionary of custom headers to send, such as User-Agent, 103s If-None-Match, etc. If None, pool headers are used. If provided, 103s these headers completely replace any pool-specific headers. 103s 103s :param retries: 103s Configure the number of retries to allow before raising a 103s :class:`~urllib3.exceptions.MaxRetryError` exception. 103s 103s Pass ``None`` to retry until you receive a response. Pass a 103s :class:`~urllib3.util.retry.Retry` object for fine-grained control 103s over different types of retries. 103s Pass an integer number to retry connection errors that many times, 103s but no other types of errors. Pass zero to never retry. 103s 103s If ``False``, then retries are disabled and any exception is raised 103s immediately. Also, instead of raising a MaxRetryError on redirects, 103s the redirect response will be returned. 103s 103s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 103s 103s :param timeout: 103s If specified, overrides the default timeout for this one 103s request. It may be a float (in seconds) or an instance of 103s :class:`urllib3.util.Timeout`. 103s 103s :param chunked: 103s If True, urllib3 will send the body using chunked transfer 103s encoding. Otherwise, urllib3 will send the body using the standard 103s content-length form. Defaults to False. 103s 103s :param response_conn: 103s Set this to ``None`` if you will handle releasing the connection or 103s set the connection to have the response release it. 103s 103s :param preload_content: 103s If True, the response's body will be preloaded during construction. 103s 103s :param decode_content: 103s If True, will attempt to decode the body based on the 103s 'content-encoding' header. 103s 103s :param enforce_content_length: 103s Enforce content length checking. Body returned by server must match 103s value of Content-Length header, if present. Otherwise, raise error. 103s """ 103s self.num_requests += 1 103s 103s timeout_obj = self._get_timeout(timeout) 103s timeout_obj.start_connect() 103s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 103s 103s try: 103s # Trigger any extra validation we need to do. 103s try: 103s self._validate_conn(conn) 103s except (SocketTimeout, BaseSSLError) as e: 103s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 103s raise 103s 103s # _validate_conn() starts the connection to an HTTPS proxy 103s # so we need to wrap errors with 'ProxyError' here too. 103s except ( 103s OSError, 103s NewConnectionError, 103s TimeoutError, 103s BaseSSLError, 103s CertificateError, 103s SSLError, 103s ) as e: 103s new_e: Exception = e 103s if isinstance(e, (BaseSSLError, CertificateError)): 103s new_e = SSLError(e) 103s # If the connection didn't successfully connect to it's proxy 103s # then there 103s if isinstance( 103s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 103s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 103s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 103s raise new_e 103s 103s # conn.request() calls http.client.*.request, not the method in 103s # urllib3.request. It also calls makefile (recv) on the socket. 
103s try: 103s conn.request( 103s method, 103s url, 103s body=body, 103s headers=headers, 103s chunked=chunked, 103s preload_content=preload_content, 103s decode_content=decode_content, 103s enforce_content_length=enforce_content_length, 103s ) 103s 103s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 103s # legitimately able to close the connection after sending a valid response. 103s # With this behaviour, the received response is still readable. 103s except BrokenPipeError: 103s pass 103s except OSError as e: 103s # MacOS/Linux 103s # EPROTOTYPE and ECONNRESET are needed on macOS 103s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 103s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 103s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 103s raise 103s 103s # Reset the timeout for the recv() on the socket 103s read_timeout = timeout_obj.read_timeout 103s 103s if not conn.is_closed: 103s # In Python 3 socket.py will catch EAGAIN and return None when you 103s # try and read into the file pointer created by http.client, which 103s # instead raises a BadStatusLine exception. Instead of catching 103s # the exception and assuming all BadStatusLine exceptions are read 103s # timeouts, check for a zero timeout before making the request. 103s if read_timeout == 0: 103s raise ReadTimeoutError( 103s self, url, f"Read timed out. (read timeout={read_timeout})" 103s ) 103s conn.timeout = read_timeout 103s 103s # Receive the response from the server 103s try: 103s response = conn.getresponse() 103s except (BaseSSLError, OSError) as e: 103s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 103s raise 103s 103s # Set properties that are used by the pooling layer. 
103s response.retries = retries 103s response._connection = response_conn # type: ignore[attr-defined] 103s response._pool = self # type: ignore[attr-defined] 103s 103s log.debug( 103s '%s://%s:%s "%s %s %s" %s %s', 103s self.scheme, 103s self.host, 103s self.port, 103s method, 103s url, 103s > response.version_string, 103s response.status, 103s response.length_remaining, 103s ) 103s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 103s 103s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 103s ----------------------------- Captured stderr call ----------------------------- 103s 127.0.0.1 - - [19/Jan/2025 20:23:18] "GET / HTTP/1.1" 200 9358 103s 127.0.0.1 - - [19/Jan/2025 20:23:18] "GET http://127.0.0.1:41093/ HTTP/1.1" 200 - 103s ______________________________ test_cross_scheme _______________________________ 103s 103s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-0/test_cross_scheme2') 103s httpbin = 103s httpbin_secure = 103s verify_pool_mgr = 103s 103s def test_cross_scheme(tmpdir, httpbin, httpbin_secure, verify_pool_mgr): 103s """Ensure that requests between schemes are treated separately""" 103s # First fetch a url under http, and then again under https and then 103s # ensure that we haven't served anything out of cache, and we have two 103s # requests / response pairs in the cassette 103s with vcr.use_cassette(str(tmpdir.join("cross_scheme.yaml"))) as cass: 103s > verify_pool_mgr.request("GET", httpbin_secure.url) 103s 103s tests/integration/test_urllib3.py:125: 103s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 103s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 103s return self.request_encode_url( 103s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 103s return self.urlopen(method, url, **extra_kw) 103s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 103s response = conn.urlopen(method, u.request_uri, **kw) 103s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 103s response = self._make_request( 103s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 103s 103s self = 103s conn = 103s method = 'GET', url = '/', body = None, headers = {} 103s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 103s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 103s chunked = False, response_conn = None, preload_content = True 103s decode_content = True, enforce_content_length = True 103s 103s def _make_request( 103s self, 103s conn: BaseHTTPConnection, 103s method: str, 103s url: str, 103s body: _TYPE_BODY | None = None, 103s headers: typing.Mapping[str, str] | None = None, 103s retries: Retry | None = None, 103s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 103s chunked: bool = False, 103s response_conn: BaseHTTPConnection | None = None, 103s preload_content: bool = True, 103s decode_content: bool = True, 103s enforce_content_length: bool = True, 103s ) -> BaseHTTPResponse: 103s """ 103s Perform a request on a given urllib connection object taken from our 103s pool. 103s 103s :param conn: 103s a connection from one of our connection pools 103s 103s :param method: 103s HTTP request method (such as GET, POST, PUT, etc.) 103s 103s :param url: 103s The URL to perform the request on. 
103s 103s :param body: 103s Data to send in the request body, either :class:`str`, :class:`bytes`, 103s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 103s 103s :param headers: 103s Dictionary of custom headers to send, such as User-Agent, 103s If-None-Match, etc. If None, pool headers are used. If provided, 103s these headers completely replace any pool-specific headers. 103s 103s :param retries: 103s Configure the number of retries to allow before raising a 103s :class:`~urllib3.exceptions.MaxRetryError` exception. 103s 103s Pass ``None`` to retry until you receive a response. Pass a 103s :class:`~urllib3.util.retry.Retry` object for fine-grained control 103s over different types of retries. 103s Pass an integer number to retry connection errors that many times, 103s but no other types of errors. Pass zero to never retry. 103s 103s If ``False``, then retries are disabled and any exception is raised 103s immediately. Also, instead of raising a MaxRetryError on redirects, 103s the redirect response will be returned. 103s 103s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 103s 103s :param timeout: 103s If specified, overrides the default timeout for this one 103s request. It may be a float (in seconds) or an instance of 103s :class:`urllib3.util.Timeout`. 103s 103s :param chunked: 103s If True, urllib3 will send the body using chunked transfer 103s encoding. Otherwise, urllib3 will send the body using the standard 103s content-length form. Defaults to False. 103s 103s :param response_conn: 103s Set this to ``None`` if you will handle releasing the connection or 103s set the connection to have the response release it. 103s 103s :param preload_content: 103s If True, the response's body will be preloaded during construction. 103s 103s :param decode_content: 103s If True, will attempt to decode the body based on the 103s 'content-encoding' header. 103s 103s :param enforce_content_length: 103s Enforce content length checking. Body returned by server must match 103s value of Content-Length header, if present. Otherwise, raise error. 103s """ 103s self.num_requests += 1 103s 103s timeout_obj = self._get_timeout(timeout) 103s timeout_obj.start_connect() 103s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 103s 103s try: 103s # Trigger any extra validation we need to do. 103s try: 103s self._validate_conn(conn) 103s except (SocketTimeout, BaseSSLError) as e: 103s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 103s raise 103s 103s # _validate_conn() starts the connection to an HTTPS proxy 103s # so we need to wrap errors with 'ProxyError' here too. 103s except ( 103s OSError, 103s NewConnectionError, 103s TimeoutError, 103s BaseSSLError, 103s CertificateError, 103s SSLError, 103s ) as e: 103s new_e: Exception = e 103s if isinstance(e, (BaseSSLError, CertificateError)): 103s new_e = SSLError(e) 103s # If the connection didn't successfully connect to it's proxy 103s # then there 103s if isinstance( 103s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 103s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 103s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 103s raise new_e 103s 103s # conn.request() calls http.client.*.request, not the method in 103s # urllib3.request. It also calls makefile (recv) on the socket. 
103s try: 103s conn.request( 103s method, 103s url, 103s body=body, 103s headers=headers, 103s chunked=chunked, 103s preload_content=preload_content, 103s decode_content=decode_content, 103s enforce_content_length=enforce_content_length, 103s ) 103s 103s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 103s # legitimately able to close the connection after sending a valid response. 103s # With this behaviour, the received response is still readable. 103s except BrokenPipeError: 103s pass 103s except OSError as e: 103s # MacOS/Linux 103s # EPROTOTYPE and ECONNRESET are needed on macOS 103s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 103s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 103s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 103s raise 103s 103s # Reset the timeout for the recv() on the socket 103s read_timeout = timeout_obj.read_timeout 103s 103s if not conn.is_closed: 103s # In Python 3 socket.py will catch EAGAIN and return None when you 103s # try and read into the file pointer created by http.client, which 103s # instead raises a BadStatusLine exception. Instead of catching 103s # the exception and assuming all BadStatusLine exceptions are read 103s # timeouts, check for a zero timeout before making the request. 103s if read_timeout == 0: 103s raise ReadTimeoutError( 103s self, url, f"Read timed out. (read timeout={read_timeout})" 103s ) 103s conn.timeout = read_timeout 103s 103s # Receive the response from the server 103s try: 103s response = conn.getresponse() 103s except (BaseSSLError, OSError) as e: 103s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 103s raise 103s 103s # Set properties that are used by the pooling layer. 
103s response.retries = retries 103s response._connection = response_conn # type: ignore[attr-defined] 103s response._pool = self # type: ignore[attr-defined] 103s 103s log.debug( 103s '%s://%s:%s "%s %s %s" %s %s', 103s self.scheme, 103s self.host, 103s self.port, 103s method, 103s url, 103s > response.version_string, 103s response.status, 103s response.length_remaining, 103s ) 103s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 103s 103s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 103s ----------------------------- Captured stderr call ----------------------------- 103s 127.0.0.1 - - [19/Jan/2025 20:23:18] "GET / HTTP/1.1" 200 9358 103s ___________________ test_https_with_cert_validation_disabled ___________________ 103s 103s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-0/test_https_with_cert_validatio0') 103s httpbin_secure = 103s pool_mgr = 103s 103s def test_https_with_cert_validation_disabled(tmpdir, httpbin_secure, pool_mgr): 103s with vcr.use_cassette(str(tmpdir.join("cert_validation_disabled.yaml"))): 103s > pool_mgr.request("GET", httpbin_secure.url) 103s 103s tests/integration/test_urllib3.py:149: 103s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 103s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 103s return self.request_encode_url( 103s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 103s return self.urlopen(method, url, **extra_kw) 103s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 103s response = conn.urlopen(method, u.request_uri, **kw) 103s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 103s response = self._make_request( 103s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 103s 103s self = 103s conn = 103s method = 'GET', url = '/', body = None, headers = {} 103s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 103s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 103s chunked = False, response_conn = None, preload_content = True 103s decode_content = True, enforce_content_length = True 103s 103s def _make_request( 103s self, 103s conn: BaseHTTPConnection, 103s method: str, 103s url: str, 103s body: _TYPE_BODY | None = None, 103s headers: typing.Mapping[str, str] | None = None, 103s retries: Retry | None = None, 103s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 103s chunked: bool = False, 103s response_conn: BaseHTTPConnection | None = None, 103s preload_content: bool = True, 103s decode_content: bool = True, 103s enforce_content_length: bool = True, 103s ) -> BaseHTTPResponse: 103s """ 103s Perform a request on a given urllib connection object taken from our 103s pool. 103s 103s :param conn: 103s a connection from one of our connection pools 103s 103s :param method: 103s HTTP request method (such as GET, POST, PUT, etc.) 103s 103s :param url: 103s The URL to perform the request on. 103s 103s :param body: 103s Data to send in the request body, either :class:`str`, :class:`bytes`, 103s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 103s 103s :param headers: 103s Dictionary of custom headers to send, such as User-Agent, 103s If-None-Match, etc. If None, pool headers are used. If provided, 103s these headers completely replace any pool-specific headers. 
103s 103s :param retries: 103s Configure the number of retries to allow before raising a 103s :class:`~urllib3.exceptions.MaxRetryError` exception. 103s 103s Pass ``None`` to retry until you receive a response. Pass a 103s :class:`~urllib3.util.retry.Retry` object for fine-grained control 103s over different types of retries. 103s Pass an integer number to retry connection errors that many times, 103s but no other types of errors. Pass zero to never retry. 103s 103s If ``False``, then retries are disabled and any exception is raised 103s immediately. Also, instead of raising a MaxRetryError on redirects, 103s the redirect response will be returned. 103s 103s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 103s 103s :param timeout: 103s If specified, overrides the default timeout for this one 103s request. It may be a float (in seconds) or an instance of 103s :class:`urllib3.util.Timeout`. 103s 103s :param chunked: 103s If True, urllib3 will send the body using chunked transfer 103s encoding. Otherwise, urllib3 will send the body using the standard 103s content-length form. Defaults to False. 103s 103s :param response_conn: 103s Set this to ``None`` if you will handle releasing the connection or 103s set the connection to have the response release it. 103s 103s :param preload_content: 103s If True, the response's body will be preloaded during construction. 103s 103s :param decode_content: 103s If True, will attempt to decode the body based on the 103s 'content-encoding' header. 103s 103s :param enforce_content_length: 103s Enforce content length checking. Body returned by server must match 103s value of Content-Length header, if present. Otherwise, raise error. 103s """ 103s self.num_requests += 1 103s 103s timeout_obj = self._get_timeout(timeout) 103s timeout_obj.start_connect() 103s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 103s 103s try: 103s # Trigger any extra validation we need to do. 103s try: 103s self._validate_conn(conn) 103s except (SocketTimeout, BaseSSLError) as e: 103s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 103s raise 103s 103s # _validate_conn() starts the connection to an HTTPS proxy 103s # so we need to wrap errors with 'ProxyError' here too. 103s except ( 103s OSError, 103s NewConnectionError, 103s TimeoutError, 103s BaseSSLError, 103s CertificateError, 103s SSLError, 103s ) as e: 103s new_e: Exception = e 103s if isinstance(e, (BaseSSLError, CertificateError)): 103s new_e = SSLError(e) 103s # If the connection didn't successfully connect to it's proxy 103s # then there 103s if isinstance( 103s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 103s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 103s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 103s raise new_e 103s 103s # conn.request() calls http.client.*.request, not the method in 103s # urllib3.request. It also calls makefile (recv) on the socket. 103s try: 103s conn.request( 103s method, 103s url, 103s body=body, 103s headers=headers, 103s chunked=chunked, 103s preload_content=preload_content, 103s decode_content=decode_content, 103s enforce_content_length=enforce_content_length, 103s ) 103s 103s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 103s # legitimately able to close the connection after sending a valid response. 103s # With this behaviour, the received response is still readable. 
103s except BrokenPipeError: 103s pass 103s except OSError as e: 103s # MacOS/Linux 103s # EPROTOTYPE and ECONNRESET are needed on macOS 103s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 103s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 103s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 103s raise 103s 103s # Reset the timeout for the recv() on the socket 103s read_timeout = timeout_obj.read_timeout 103s 103s if not conn.is_closed: 103s # In Python 3 socket.py will catch EAGAIN and return None when you 103s # try and read into the file pointer created by http.client, which 103s # instead raises a BadStatusLine exception. Instead of catching 103s # the exception and assuming all BadStatusLine exceptions are read 103s # timeouts, check for a zero timeout before making the request. 103s if read_timeout == 0: 103s raise ReadTimeoutError( 103s self, url, f"Read timed out. (read timeout={read_timeout})" 103s ) 103s conn.timeout = read_timeout 103s 103s # Receive the response from the server 103s try: 103s response = conn.getresponse() 103s except (BaseSSLError, OSError) as e: 103s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 103s raise 103s 103s # Set properties that are used by the pooling layer. 103s response.retries = retries 103s response._connection = response_conn # type: ignore[attr-defined] 103s response._pool = self # type: ignore[attr-defined] 103s 103s log.debug( 103s '%s://%s:%s "%s %s %s" %s %s', 103s self.scheme, 103s self.host, 103s self.port, 103s method, 103s url, 103s > response.version_string, 103s response.status, 103s response.length_remaining, 103s ) 103s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 103s 103s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 103s ----------------------------- Captured stderr call ----------------------------- 103s 127.0.0.1 - - [19/Jan/2025 20:23:18] "GET / HTTP/1.1" 200 9358 103s _____________________________ test_domain_redirect _____________________________ 103s 103s def test_domain_redirect(): 103s """Ensure that redirects across domains are considered unique""" 103s # In this example, seomoz.org redirects to moz.com, and if those 103s # requests are considered identical, then we'll be stuck in a redirect 103s # loop. 
103s url = "http://seomoz.org/" 103s with vcr.use_cassette("tests/fixtures/wild/domain_redirect.yaml") as cass: 103s > requests.get(url, headers={"User-Agent": "vcrpy-test"}) 103s 103s tests/integration/test_wild.py:20: 103s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 103s /usr/lib/python3/dist-packages/requests/api.py:73: in get 103s return request("get", url, params=params, **kwargs) 103s /usr/lib/python3/dist-packages/requests/api.py:59: in request 103s return session.request(method=method, url=url, **kwargs) 103s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 103s resp = self.send(prep, **send_kwargs) 103s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 103s r = adapter.send(request, **kwargs) 103s /usr/lib/python3/dist-packages/requests/adapters.py:667: in send 103s resp = conn.urlopen( 103s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 103s response = self._make_request( 103s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 103s 103s self = 103s conn = 103s method = 'GET', url = '/', body = None 103s headers = {'User-Agent': 'vcrpy-test', 'Accept-Encoding': 'gzip, deflate, br', 'Accept': '*/*', 'Connection': 'keep-alive'} 103s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 103s timeout = Timeout(connect=None, read=None, total=None), chunked = False 103s response_conn = 103s preload_content = False, decode_content = False, enforce_content_length = True 103s 103s def _make_request( 103s self, 103s conn: BaseHTTPConnection, 103s method: str, 103s url: str, 103s body: _TYPE_BODY | None = None, 103s headers: typing.Mapping[str, str] | None = None, 103s retries: Retry | None = None, 103s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 103s chunked: bool = False, 103s response_conn: BaseHTTPConnection | None = None, 103s preload_content: bool = True, 103s decode_content: bool = True, 103s enforce_content_length: bool = True, 103s ) -> BaseHTTPResponse: 103s """ 103s Perform a request on a given urllib connection object taken from our 103s pool. 103s 103s :param conn: 103s a connection from one of our connection pools 103s 103s :param method: 103s HTTP request method (such as GET, POST, PUT, etc.) 103s 103s :param url: 103s The URL to perform the request on. 103s 103s :param body: 103s Data to send in the request body, either :class:`str`, :class:`bytes`, 103s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 103s 103s :param headers: 103s Dictionary of custom headers to send, such as User-Agent, 103s If-None-Match, etc. If None, pool headers are used. If provided, 103s these headers completely replace any pool-specific headers. 103s 103s :param retries: 103s Configure the number of retries to allow before raising a 103s :class:`~urllib3.exceptions.MaxRetryError` exception. 103s 103s Pass ``None`` to retry until you receive a response. Pass a 103s :class:`~urllib3.util.retry.Retry` object for fine-grained control 103s over different types of retries. 103s Pass an integer number to retry connection errors that many times, 103s but no other types of errors. Pass zero to never retry. 103s 103s If ``False``, then retries are disabled and any exception is raised 103s immediately. Also, instead of raising a MaxRetryError on redirects, 103s the redirect response will be returned. 103s 103s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 
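As an aside for readers following the quoted docstring: the retries and timeout parameters it describes are normally supplied to urllib3 as Retry and Timeout objects. A minimal sketch, with illustrative values and a placeholder URL that is not part of the test suite:

    # Minimal urllib3 usage sketch; values and endpoint are illustrative.
    import urllib3
    from urllib3.util import Retry, Timeout

    http = urllib3.PoolManager(
        retries=Retry(total=3, redirect=2),      # fine-grained retry policy
        timeout=Timeout(connect=2.0, read=5.0),  # separate connect/read timeouts
    )
    # resp = http.request("GET", "http://example.invalid/")  # placeholder endpoint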
103s 103s :param timeout: 103s If specified, overrides the default timeout for this one 103s request. It may be a float (in seconds) or an instance of 103s :class:`urllib3.util.Timeout`. 103s 103s :param chunked: 103s If True, urllib3 will send the body using chunked transfer 103s encoding. Otherwise, urllib3 will send the body using the standard 103s content-length form. Defaults to False. 103s 103s :param response_conn: 103s Set this to ``None`` if you will handle releasing the connection or 103s set the connection to have the response release it. 103s 103s :param preload_content: 103s If True, the response's body will be preloaded during construction. 103s 103s :param decode_content: 103s If True, will attempt to decode the body based on the 103s 'content-encoding' header. 103s 103s :param enforce_content_length: 103s Enforce content length checking. Body returned by server must match 103s value of Content-Length header, if present. Otherwise, raise error. 103s """ 103s self.num_requests += 1 103s 103s timeout_obj = self._get_timeout(timeout) 103s timeout_obj.start_connect() 103s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 103s 103s try: 103s # Trigger any extra validation we need to do. 103s try: 103s self._validate_conn(conn) 103s except (SocketTimeout, BaseSSLError) as e: 103s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 103s raise 103s 103s # _validate_conn() starts the connection to an HTTPS proxy 103s # so we need to wrap errors with 'ProxyError' here too. 103s except ( 103s OSError, 103s NewConnectionError, 103s TimeoutError, 103s BaseSSLError, 103s CertificateError, 103s SSLError, 103s ) as e: 103s new_e: Exception = e 103s if isinstance(e, (BaseSSLError, CertificateError)): 103s new_e = SSLError(e) 103s # If the connection didn't successfully connect to it's proxy 103s # then there 103s if isinstance( 103s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 103s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 103s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 103s raise new_e 103s 103s # conn.request() calls http.client.*.request, not the method in 103s # urllib3.request. It also calls makefile (recv) on the socket. 103s try: 103s conn.request( 103s method, 103s url, 103s body=body, 103s headers=headers, 103s chunked=chunked, 103s preload_content=preload_content, 103s decode_content=decode_content, 103s enforce_content_length=enforce_content_length, 103s ) 103s 103s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 103s # legitimately able to close the connection after sending a valid response. 103s # With this behaviour, the received response is still readable. 103s except BrokenPipeError: 103s pass 103s except OSError as e: 103s # MacOS/Linux 103s # EPROTOTYPE and ECONNRESET are needed on macOS 103s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 103s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 103s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 103s raise 103s 103s # Reset the timeout for the recv() on the socket 103s read_timeout = timeout_obj.read_timeout 103s 103s if not conn.is_closed: 103s # In Python 3 socket.py will catch EAGAIN and return None when you 103s # try and read into the file pointer created by http.client, which 103s # instead raises a BadStatusLine exception. 
Instead of catching 103s # the exception and assuming all BadStatusLine exceptions are read 103s # timeouts, check for a zero timeout before making the request. 103s if read_timeout == 0: 103s raise ReadTimeoutError( 103s self, url, f"Read timed out. (read timeout={read_timeout})" 103s ) 103s conn.timeout = read_timeout 103s 103s # Receive the response from the server 103s try: 103s response = conn.getresponse() 103s except (BaseSSLError, OSError) as e: 103s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 103s raise 103s 103s # Set properties that are used by the pooling layer. 103s response.retries = retries 103s response._connection = response_conn # type: ignore[attr-defined] 103s response._pool = self # type: ignore[attr-defined] 103s 103s log.debug( 103s '%s://%s:%s "%s %s %s" %s %s', 103s self.scheme, 103s self.host, 103s self.port, 103s method, 103s url, 103s > response.version_string, 103s response.status, 103s response.length_remaining, 103s ) 103s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 103s 103s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 103s _________________________________ test_cookies _________________________________ 103s 103s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-0/test_cookies0') 103s httpbin = 103s 103s def test_cookies(tmpdir, httpbin): 103s testfile = str(tmpdir.join("cookies.yml")) 103s with vcr.use_cassette(testfile): 103s with requests.Session() as s: 103s > s.get(httpbin.url + "/cookies/set?k1=v1&k2=v2") 103s 103s tests/integration/test_wild.py:67: 103s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 103s /usr/lib/python3/dist-packages/requests/sessions.py:602: in get 103s return self.request("GET", url, **kwargs) 103s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 103s resp = self.send(prep, **send_kwargs) 103s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 103s r = adapter.send(request, **kwargs) 103s /usr/lib/python3/dist-packages/requests/adapters.py:667: in send 103s resp = conn.urlopen( 103s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 103s response = self._make_request( 103s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 103s 103s self = 103s conn = 103s method = 'GET', url = '/cookies/set?k1=v1&k2=v2', body = None 103s headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate, br', 'Accept': '*/*', 'Connection': 'keep-alive'} 103s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 103s timeout = Timeout(connect=None, read=None, total=None), chunked = False 103s response_conn = 103s preload_content = False, decode_content = False, enforce_content_length = True 103s 103s def _make_request( 103s self, 103s conn: BaseHTTPConnection, 103s method: str, 103s url: str, 103s body: _TYPE_BODY | None = None, 103s headers: typing.Mapping[str, str] | None = None, 103s retries: Retry | None = None, 103s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 103s chunked: bool = False, 103s response_conn: BaseHTTPConnection | None = None, 103s preload_content: bool = True, 103s decode_content: bool = True, 103s enforce_content_length: bool = True, 103s ) -> BaseHTTPResponse: 103s """ 103s Perform a request on a given urllib connection object taken from our 103s pool. 
103s 103s :param conn: 103s a connection from one of our connection pools 103s 103s :param method: 103s HTTP request method (such as GET, POST, PUT, etc.) 103s 103s :param url: 103s The URL to perform the request on. 103s 103s :param body: 103s Data to send in the request body, either :class:`str`, :class:`bytes`, 103s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 103s 103s :param headers: 103s Dictionary of custom headers to send, such as User-Agent, 103s If-None-Match, etc. If None, pool headers are used. If provided, 103s these headers completely replace any pool-specific headers. 103s 103s :param retries: 103s Configure the number of retries to allow before raising a 103s :class:`~urllib3.exceptions.MaxRetryError` exception. 103s 103s Pass ``None`` to retry until you receive a response. Pass a 103s :class:`~urllib3.util.retry.Retry` object for fine-grained control 103s over different types of retries. 103s Pass an integer number to retry connection errors that many times, 103s but no other types of errors. Pass zero to never retry. 103s 103s If ``False``, then retries are disabled and any exception is raised 103s immediately. Also, instead of raising a MaxRetryError on redirects, 103s the redirect response will be returned. 103s 103s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 103s 103s :param timeout: 103s If specified, overrides the default timeout for this one 103s request. It may be a float (in seconds) or an instance of 103s :class:`urllib3.util.Timeout`. 103s 103s :param chunked: 103s If True, urllib3 will send the body using chunked transfer 103s encoding. Otherwise, urllib3 will send the body using the standard 103s content-length form. Defaults to False. 103s 103s :param response_conn: 103s Set this to ``None`` if you will handle releasing the connection or 103s set the connection to have the response release it. 103s 103s :param preload_content: 103s If True, the response's body will be preloaded during construction. 103s 103s :param decode_content: 103s If True, will attempt to decode the body based on the 103s 'content-encoding' header. 103s 103s :param enforce_content_length: 103s Enforce content length checking. Body returned by server must match 103s value of Content-Length header, if present. Otherwise, raise error. 103s """ 103s self.num_requests += 1 103s 103s timeout_obj = self._get_timeout(timeout) 103s timeout_obj.start_connect() 103s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 103s 103s try: 103s # Trigger any extra validation we need to do. 103s try: 103s self._validate_conn(conn) 103s except (SocketTimeout, BaseSSLError) as e: 103s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 103s raise 103s 103s # _validate_conn() starts the connection to an HTTPS proxy 103s # so we need to wrap errors with 'ProxyError' here too. 103s except ( 103s OSError, 103s NewConnectionError, 103s TimeoutError, 103s BaseSSLError, 103s CertificateError, 103s SSLError, 103s ) as e: 103s new_e: Exception = e 103s if isinstance(e, (BaseSSLError, CertificateError)): 103s new_e = SSLError(e) 103s # If the connection didn't successfully connect to it's proxy 103s # then there 103s if isinstance( 103s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 103s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 103s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 103s raise new_e 103s 103s # conn.request() calls http.client.*.request, not the method in 103s # urllib3.request. 
It also calls makefile (recv) on the socket. 103s try: 103s conn.request( 103s method, 103s url, 103s body=body, 103s headers=headers, 103s chunked=chunked, 103s preload_content=preload_content, 103s decode_content=decode_content, 103s enforce_content_length=enforce_content_length, 103s ) 103s 103s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 103s # legitimately able to close the connection after sending a valid response. 103s # With this behaviour, the received response is still readable. 103s except BrokenPipeError: 103s pass 103s except OSError as e: 103s # MacOS/Linux 103s # EPROTOTYPE and ECONNRESET are needed on macOS 103s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 103s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 103s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 103s raise 103s 103s # Reset the timeout for the recv() on the socket 103s read_timeout = timeout_obj.read_timeout 103s 103s if not conn.is_closed: 103s # In Python 3 socket.py will catch EAGAIN and return None when you 103s # try and read into the file pointer created by http.client, which 103s # instead raises a BadStatusLine exception. Instead of catching 103s # the exception and assuming all BadStatusLine exceptions are read 103s # timeouts, check for a zero timeout before making the request. 103s if read_timeout == 0: 103s raise ReadTimeoutError( 103s self, url, f"Read timed out. (read timeout={read_timeout})" 103s ) 103s conn.timeout = read_timeout 103s 103s # Receive the response from the server 103s try: 103s response = conn.getresponse() 103s except (BaseSSLError, OSError) as e: 103s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 103s raise 103s 103s # Set properties that are used by the pooling layer. 
103s response.retries = retries 103s response._connection = response_conn # type: ignore[attr-defined] 103s response._pool = self # type: ignore[attr-defined] 103s 103s log.debug( 103s '%s://%s:%s "%s %s %s" %s %s', 103s self.scheme, 103s self.host, 103s self.port, 103s method, 103s url, 103s > response.version_string, 103s response.status, 103s response.length_remaining, 103s ) 103s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 103s 103s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 103s ----------------------------- Captured stderr call ----------------------------- 103s 127.0.0.1 - - [19/Jan/2025 20:23:18] "GET /cookies/set?k1=v1&k2=v2 HTTP/1.1" 302 203 103s _______________ TestVCRConnection.test_body_consumed_once_stream _______________ 103s 103s self = 103s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-0/test_body_consumed_once_stream0') 103s httpbin = 103s 103s def test_body_consumed_once_stream(self, tmpdir, httpbin): 103s > self._test_body_consumed_once( 103s tmpdir, 103s httpbin, 103s BytesIO(b"1234567890"), 103s BytesIO(b"9876543210"), 103s BytesIO(b"9876543210"), 103s ) 103s 103s tests/unit/test_stubs.py:29: 103s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 103s tests/unit/test_stubs.py:69: in _test_body_consumed_once 103s conn2.getresponse() 103s /usr/lib/python3/dist-packages/vcr/stubs/__init__.py:277: in getresponse 103s self.real_connection.request( 103s /usr/lib/python3.13/http/client.py:1336: in request 103s self._send_request(method, url, body, headers, encode_chunked) 103s /usr/lib/python3.13/http/client.py:1382: in _send_request 103s self.endheaders(body, encode_chunked=encode_chunked) 103s /usr/lib/python3.13/http/client.py:1331: in endheaders 103s self._send_output(message_body, encode_chunked=encode_chunked) 103s /usr/lib/python3.13/http/client.py:1134: in _send_output 103s self.send(b'0\r\n\r\n') 103s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 103s 103s self = 103s data = b'0\r\n\r\n' 103s 103s def send(self, data): 103s """Send `data' to the server. 103s ``data`` can be a string object, a bytes object, an array object, a 103s file-like object that supports a .read() method, or an iterable object. 
103s """ 103s 103s if self.sock is None: 103s if self.auto_open: 103s self.connect() 103s else: 103s raise NotConnected() 103s 103s if self.debuglevel > 0: 103s print("send:", repr(data)) 103s if hasattr(data, "read") : 103s if self.debuglevel > 0: 103s print("sending a readable") 103s encode = self._is_textIO(data) 103s if encode and self.debuglevel > 0: 103s print("encoding file using iso-8859-1") 103s while datablock := data.read(self.blocksize): 103s if encode: 103s datablock = datablock.encode("iso-8859-1") 103s sys.audit("http.client.send", self, datablock) 103s self.sock.sendall(datablock) 103s return 103s sys.audit("http.client.send", self, data) 103s try: 103s > self.sock.sendall(data) 103s E BrokenPipeError: [Errno 32] Broken pipe 103s 103s /usr/lib/python3.13/http/client.py:1055: BrokenPipeError 103s ----------------------------- Captured stderr call ----------------------------- 103s 127.0.0.1 - - [19/Jan/2025 20:23:18] "POST /anything HTTP/1.1" 501 159 103s 127.0.0.1 - - [19/Jan/2025 20:23:18] "POST /anything HTTP/1.1" 501 159 103s =============================== warnings summary =============================== 103s tests/integration/test_config.py:10 103s /tmp/autopkgtest.eenLbd/autopkgtest_tmp/build/tests/integration/test_config.py:10: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 103s @pytest.mark.online 103s 103s tests/integration/test_config.py:24 103s /tmp/autopkgtest.eenLbd/autopkgtest_tmp/build/tests/integration/test_config.py:24: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 103s @pytest.mark.online 103s 103s tests/integration/test_config.py:34 103s /tmp/autopkgtest.eenLbd/autopkgtest_tmp/build/tests/integration/test_config.py:34: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 103s @pytest.mark.online 103s 103s tests/integration/test_config.py:47 103s /tmp/autopkgtest.eenLbd/autopkgtest_tmp/build/tests/integration/test_config.py:47: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 103s @pytest.mark.online 103s 103s tests/integration/test_config.py:69 103s /tmp/autopkgtest.eenLbd/autopkgtest_tmp/build/tests/integration/test_config.py:69: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 103s @pytest.mark.online 103s 103s tests/integration/test_disksaver.py:14 103s /tmp/autopkgtest.eenLbd/autopkgtest_tmp/build/tests/integration/test_disksaver.py:14: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 103s @pytest.mark.online 103s 103s tests/integration/test_disksaver.py:35 103s /tmp/autopkgtest.eenLbd/autopkgtest_tmp/build/tests/integration/test_disksaver.py:35: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? 
You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 103s @pytest.mark.online 103s 103s tests/integration/test_httplib2.py:60 103s /tmp/autopkgtest.eenLbd/autopkgtest_tmp/build/tests/integration/test_httplib2.py:60: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 103s @pytest.mark.online 103s 103s tests/integration/test_register_matcher.py:16 103s /tmp/autopkgtest.eenLbd/autopkgtest_tmp/build/tests/integration/test_register_matcher.py:16: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 103s @pytest.mark.online 103s 103s tests/integration/test_register_matcher.py:32 103s /tmp/autopkgtest.eenLbd/autopkgtest_tmp/build/tests/integration/test_register_matcher.py:32: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 103s @pytest.mark.online 103s 103s tests/integration/test_urllib2.py:60 103s /tmp/autopkgtest.eenLbd/autopkgtest_tmp/build/tests/integration/test_urllib2.py:60: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 103s @mark.online 103s 103s tests/integration/test_urllib3.py:102 103s /tmp/autopkgtest.eenLbd/autopkgtest_tmp/build/tests/integration/test_urllib3.py:102: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 103s @pytest.mark.online 103s 103s tests/integration/test_wild.py:55 103s /tmp/autopkgtest.eenLbd/autopkgtest_tmp/build/tests/integration/test_wild.py:55: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 103s @pytest.mark.online 103s 103s tests/integration/test_wild.py:74 103s /tmp/autopkgtest.eenLbd/autopkgtest_tmp/build/tests/integration/test_wild.py:74: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 103s @pytest.mark.online 103s 103s tests/unit/test_stubs.py:20 103s /tmp/autopkgtest.eenLbd/autopkgtest_tmp/build/tests/unit/test_stubs.py:20: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 103s @mark.online 103s 103s tests/unit/test_unittest.py:131 103s /tmp/autopkgtest.eenLbd/autopkgtest_tmp/build/tests/unit/test_unittest.py:131: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 103s @pytest.mark.online 103s 103s tests/unit/test_unittest.py:166 103s /tmp/autopkgtest.eenLbd/autopkgtest_tmp/build/tests/unit/test_unittest.py:166: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? 
You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 103s @pytest.mark.online 103s 103s tests/integration/test_wild.py::test_xmlrpclib 103s /usr/lib/python3.13/multiprocessing/popen_fork.py:67: DeprecationWarning: This process (pid=3906) is multi-threaded, use of fork() may lead to deadlocks in the child. 103s self.pid = os.fork() 103s 103s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 103s =========================== short test summary info ============================ 103s FAILED tests/integration/test_urllib3.py::test_status_code[http] - AttributeE... 103s FAILED tests/integration/test_urllib3.py::test_headers[http] - AttributeError... 103s FAILED tests/integration/test_urllib3.py::test_body[http] - AttributeError: '... 103s FAILED tests/integration/test_urllib3.py::test_auth[http] - AttributeError: '... 103s FAILED tests/integration/test_urllib3.py::test_auth_failed[http] - AttributeE... 103s FAILED tests/integration/test_urllib3.py::test_post[http] - AttributeError: '... 103s FAILED tests/integration/test_urllib3.py::test_gzip[http] - AttributeError: '... 103s FAILED tests/integration/test_urllib3.py::test_status_code[https] - Attribute... 103s FAILED tests/integration/test_urllib3.py::test_headers[https] - AttributeErro... 103s FAILED tests/integration/test_urllib3.py::test_body[https] - AttributeError: ... 103s FAILED tests/integration/test_urllib3.py::test_auth[https] - AttributeError: ... 103s FAILED tests/integration/test_urllib3.py::test_auth_failed[https] - Attribute... 103s FAILED tests/integration/test_urllib3.py::test_post[https] - AttributeError: ... 103s FAILED tests/integration/test_urllib3.py::test_gzip[https] - AttributeError: ... 103s FAILED tests/integration/test_proxy.py::test_use_proxy - AttributeError: 'VCR... 103s FAILED tests/integration/test_urllib3.py::test_cross_scheme - AttributeError:... 103s FAILED tests/integration/test_urllib3.py::test_https_with_cert_validation_disabled 103s FAILED tests/integration/test_wild.py::test_domain_redirect - AttributeError:... 103s FAILED tests/integration/test_wild.py::test_cookies - AttributeError: 'VCRHTT... 103s FAILED tests/unit/test_stubs.py::TestVCRConnection::test_body_consumed_once_stream 103s ==== 20 failed, 264 passed, 3 skipped, 19 deselected, 18 warnings in 3.72s ===== 104s E: pybuild pybuild:389: test: plugin distutils failed with: exit code=1: cd /tmp/autopkgtest.eenLbd/autopkgtest_tmp/build; python3.13 -m pytest --ignore tests/integration/test_aiohttp.py --ignore tests/integration/test_tornado.py --ignore tests/integration/test_requests.py -m "not online" -k "not test_basic_json_use and not test_load_cassette_with_custom_persister" 104s I: pybuild base:311: cd /tmp/autopkgtest.eenLbd/autopkgtest_tmp/build; python3.12 -m pytest --ignore tests/integration/test_aiohttp.py --ignore tests/integration/test_tornado.py --ignore tests/integration/test_requests.py -m "not online" -k "not test_basic_json_use and not test_load_cassette_with_custom_persister" 105s ============================= test session starts ============================== 105s platform linux -- Python 3.12.8, pytest-8.3.4, pluggy-1.5.0 105s rootdir: /tmp/autopkgtest.eenLbd/autopkgtest_tmp/build 105s plugins: httpbin-2.1.0, tornado-0.8.1, typeguard-4.4.1 105s collected 305 items / 19 deselected / 1 skipped / 286 selected 105s 105s tests/integration/test_basic.py .... [ 1%] 105s tests/integration/test_boto3.py ss [ 2%] 105s tests/integration/test_config.py . 
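The PytestUnknownMarkWarning entries in the warnings summary above all refer to the custom online mark, which the invocation deselects with -m "not online" but which is never registered with pytest. A conftest.py hook such as the hypothetical one below would register the marker and silence those warnings; whether the package chooses to do so is outside this log.

    # Hypothetical conftest.py hook; pytest_configure and addinivalue_line are
    # standard pytest APIs for registering custom markers.
    def pytest_configure(config):
        config.addinivalue_line("markers", "online: tests that need real network access")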
[ 2%] 105s tests/integration/test_filter.py .......... [ 5%] 105s tests/integration/test_httplib2.py ........ [ 8%] 105s tests/integration/test_urllib2.py ........ [ 11%] 105s tests/integration/test_urllib3.py FFFFFFF [ 13%] 105s tests/integration/test_httplib2.py ........ [ 16%] 105s tests/integration/test_urllib2.py ........ [ 19%] 105s tests/integration/test_urllib3.py FFFFFFF [ 22%] 105s tests/integration/test_httplib2.py . [ 22%] 105s tests/integration/test_ignore.py .... [ 23%] 106s tests/integration/test_matchers.py .............. [ 28%] 106s tests/integration/test_multiple.py . [ 29%] 106s tests/integration/test_proxy.py F [ 29%] 106s tests/integration/test_record_mode.py ........ [ 32%] 106s tests/integration/test_register_persister.py .. [ 32%] 106s tests/integration/test_register_serializer.py . [ 33%] 106s tests/integration/test_request.py .. [ 33%] 106s tests/integration/test_stubs.py .... [ 35%] 106s tests/integration/test_urllib2.py . [ 35%] 106s tests/integration/test_urllib3.py FF. [ 36%] 106s tests/integration/test_wild.py F.F. [ 38%] 106s tests/unit/test_cassettes.py ............................... [ 48%] 106s tests/unit/test_errors.py .... [ 50%] 106s tests/unit/test_filters.py ........................ [ 58%] 106s tests/unit/test_json_serializer.py . [ 59%] 106s tests/unit/test_matchers.py ............................ [ 68%] 106s tests/unit/test_migration.py ... [ 69%] 106s tests/unit/test_persist.py .... [ 71%] 106s tests/unit/test_request.py ................. [ 77%] 106s tests/unit/test_response.py .... [ 78%] 106s tests/unit/test_serialize.py ............... [ 83%] 106s tests/unit/test_stubs.py ... [ 84%] 106s tests/unit/test_unittest.py ....... [ 87%] 106s tests/unit/test_util.py ........... [ 91%] 106s tests/unit/test_vcr.py ........................ [ 99%] 107s tests/unit/test_vcr_import.py . 
[100%] 107s 107s =================================== FAILURES =================================== 107s ____________________________ test_status_code[http] ____________________________ 107s 107s httpbin_both = 107s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-1/test_status_code_http_0') 107s verify_pool_mgr = 107s 107s def test_status_code(httpbin_both, tmpdir, verify_pool_mgr): 107s """Ensure that we can read the status code""" 107s url = httpbin_both.url 107s with vcr.use_cassette(str(tmpdir.join("atts.yaml"))): 107s > status_code = verify_pool_mgr.request("GET", url).status 107s 107s tests/integration/test_urllib3.py:34: 107s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 107s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 107s return self.request_encode_url( 107s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 107s return self.urlopen(method, url, **extra_kw) 107s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 107s response = conn.urlopen(method, u.request_uri, **kw) 107s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 107s response = self._make_request( 107s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 107s 107s self = 107s conn = 107s method = 'GET', url = '/', body = None, headers = {} 107s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 107s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 107s chunked = False, response_conn = None, preload_content = True 107s decode_content = True, enforce_content_length = True 107s 107s def _make_request( 107s self, 107s conn: BaseHTTPConnection, 107s method: str, 107s url: str, 107s body: _TYPE_BODY | None = None, 107s headers: typing.Mapping[str, str] | None = None, 107s retries: Retry | None = None, 107s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 107s chunked: bool = False, 107s response_conn: BaseHTTPConnection | None = None, 107s preload_content: bool = True, 107s decode_content: bool = True, 107s enforce_content_length: bool = True, 107s ) -> BaseHTTPResponse: 107s """ 107s Perform a request on a given urllib connection object taken from our 107s pool. 107s 107s :param conn: 107s a connection from one of our connection pools 107s 107s :param method: 107s HTTP request method (such as GET, POST, PUT, etc.) 107s 107s :param url: 107s The URL to perform the request on. 107s 107s :param body: 107s Data to send in the request body, either :class:`str`, :class:`bytes`, 107s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 107s 107s :param headers: 107s Dictionary of custom headers to send, such as User-Agent, 107s If-None-Match, etc. If None, pool headers are used. If provided, 107s these headers completely replace any pool-specific headers. 107s 107s :param retries: 107s Configure the number of retries to allow before raising a 107s :class:`~urllib3.exceptions.MaxRetryError` exception. 107s 107s Pass ``None`` to retry until you receive a response. Pass a 107s :class:`~urllib3.util.retry.Retry` object for fine-grained control 107s over different types of retries. 107s Pass an integer number to retry connection errors that many times, 107s but no other types of errors. Pass zero to never retry. 107s 107s If ``False``, then retries are disabled and any exception is raised 107s immediately. 
Also, instead of raising a MaxRetryError on redirects, 107s the redirect response will be returned. 107s 107s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 107s 107s :param timeout: 107s If specified, overrides the default timeout for this one 107s request. It may be a float (in seconds) or an instance of 107s :class:`urllib3.util.Timeout`. 107s 107s :param chunked: 107s If True, urllib3 will send the body using chunked transfer 107s encoding. Otherwise, urllib3 will send the body using the standard 107s content-length form. Defaults to False. 107s 107s :param response_conn: 107s Set this to ``None`` if you will handle releasing the connection or 107s set the connection to have the response release it. 107s 107s :param preload_content: 107s If True, the response's body will be preloaded during construction. 107s 107s :param decode_content: 107s If True, will attempt to decode the body based on the 107s 'content-encoding' header. 107s 107s :param enforce_content_length: 107s Enforce content length checking. Body returned by server must match 107s value of Content-Length header, if present. Otherwise, raise error. 107s """ 107s self.num_requests += 1 107s 107s timeout_obj = self._get_timeout(timeout) 107s timeout_obj.start_connect() 107s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 107s 107s try: 107s # Trigger any extra validation we need to do. 107s try: 107s self._validate_conn(conn) 107s except (SocketTimeout, BaseSSLError) as e: 107s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 107s raise 107s 107s # _validate_conn() starts the connection to an HTTPS proxy 107s # so we need to wrap errors with 'ProxyError' here too. 107s except ( 107s OSError, 107s NewConnectionError, 107s TimeoutError, 107s BaseSSLError, 107s CertificateError, 107s SSLError, 107s ) as e: 107s new_e: Exception = e 107s if isinstance(e, (BaseSSLError, CertificateError)): 107s new_e = SSLError(e) 107s # If the connection didn't successfully connect to it's proxy 107s # then there 107s if isinstance( 107s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 107s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 107s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 107s raise new_e 107s 107s # conn.request() calls http.client.*.request, not the method in 107s # urllib3.request. It also calls makefile (recv) on the socket. 107s try: 107s conn.request( 107s method, 107s url, 107s body=body, 107s headers=headers, 107s chunked=chunked, 107s preload_content=preload_content, 107s decode_content=decode_content, 107s enforce_content_length=enforce_content_length, 107s ) 107s 107s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 107s # legitimately able to close the connection after sending a valid response. 107s # With this behaviour, the received response is still readable. 107s except BrokenPipeError: 107s pass 107s except OSError as e: 107s # MacOS/Linux 107s # EPROTOTYPE and ECONNRESET are needed on macOS 107s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 107s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
107s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 107s raise 107s 107s # Reset the timeout for the recv() on the socket 107s read_timeout = timeout_obj.read_timeout 107s 107s if not conn.is_closed: 107s # In Python 3 socket.py will catch EAGAIN and return None when you 107s # try and read into the file pointer created by http.client, which 107s # instead raises a BadStatusLine exception. Instead of catching 107s # the exception and assuming all BadStatusLine exceptions are read 107s # timeouts, check for a zero timeout before making the request. 107s if read_timeout == 0: 107s raise ReadTimeoutError( 107s self, url, f"Read timed out. (read timeout={read_timeout})" 107s ) 107s conn.timeout = read_timeout 107s 107s # Receive the response from the server 107s try: 107s response = conn.getresponse() 107s except (BaseSSLError, OSError) as e: 107s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 107s raise 107s 107s # Set properties that are used by the pooling layer. 107s response.retries = retries 107s response._connection = response_conn # type: ignore[attr-defined] 107s response._pool = self # type: ignore[attr-defined] 107s 107s log.debug( 107s '%s://%s:%s "%s %s %s" %s %s', 107s self.scheme, 107s self.host, 107s self.port, 107s method, 107s url, 107s > response.version_string, 107s response.status, 107s response.length_remaining, 107s ) 107s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 107s 107s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 107s ----------------------------- Captured stderr call ----------------------------- 107s 127.0.0.1 - - [19/Jan/2025 20:23:21] "GET / HTTP/1.1" 200 9358 107s ______________________________ test_headers[http] ______________________________ 107s 107s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-1/test_headers_http_0') 107s httpbin_both = 107s verify_pool_mgr = 107s 107s def test_headers(tmpdir, httpbin_both, verify_pool_mgr): 107s """Ensure that we can read the headers back""" 107s url = httpbin_both.url 107s with vcr.use_cassette(str(tmpdir.join("headers.yaml"))): 107s > headers = verify_pool_mgr.request("GET", url).headers 107s 107s tests/integration/test_urllib3.py:44: 107s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 107s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 107s return self.request_encode_url( 107s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 107s return self.urlopen(method, url, **extra_kw) 107s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 107s response = conn.urlopen(method, u.request_uri, **kw) 107s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 107s response = self._make_request( 107s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 107s 107s self = 107s conn = 107s method = 'GET', url = '/', body = None, headers = {} 107s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 107s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 107s chunked = False, response_conn = None, preload_content = True 107s decode_content = True, enforce_content_length = True 107s 107s def _make_request( 107s self, 107s conn: BaseHTTPConnection, 107s method: str, 107s url: str, 107s body: _TYPE_BODY | None = None, 107s headers: typing.Mapping[str, str] | None = None, 107s retries: Retry | None = None, 107s 
timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 107s chunked: bool = False, 107s response_conn: BaseHTTPConnection | None = None, 107s preload_content: bool = True, 107s decode_content: bool = True, 107s enforce_content_length: bool = True, 107s ) -> BaseHTTPResponse: 107s """ 107s Perform a request on a given urllib connection object taken from our 107s pool. 107s 107s :param conn: 107s a connection from one of our connection pools 107s 107s :param method: 107s HTTP request method (such as GET, POST, PUT, etc.) 107s 107s :param url: 107s The URL to perform the request on. 107s 107s :param body: 107s Data to send in the request body, either :class:`str`, :class:`bytes`, 107s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 107s 107s :param headers: 107s Dictionary of custom headers to send, such as User-Agent, 107s If-None-Match, etc. If None, pool headers are used. If provided, 107s these headers completely replace any pool-specific headers. 107s 107s :param retries: 107s Configure the number of retries to allow before raising a 107s :class:`~urllib3.exceptions.MaxRetryError` exception. 107s 107s Pass ``None`` to retry until you receive a response. Pass a 107s :class:`~urllib3.util.retry.Retry` object for fine-grained control 107s over different types of retries. 107s Pass an integer number to retry connection errors that many times, 107s but no other types of errors. Pass zero to never retry. 107s 107s If ``False``, then retries are disabled and any exception is raised 107s immediately. Also, instead of raising a MaxRetryError on redirects, 107s the redirect response will be returned. 107s 107s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 107s 107s :param timeout: 107s If specified, overrides the default timeout for this one 107s request. It may be a float (in seconds) or an instance of 107s :class:`urllib3.util.Timeout`. 107s 107s :param chunked: 107s If True, urllib3 will send the body using chunked transfer 107s encoding. Otherwise, urllib3 will send the body using the standard 107s content-length form. Defaults to False. 107s 107s :param response_conn: 107s Set this to ``None`` if you will handle releasing the connection or 107s set the connection to have the response release it. 107s 107s :param preload_content: 107s If True, the response's body will be preloaded during construction. 107s 107s :param decode_content: 107s If True, will attempt to decode the body based on the 107s 'content-encoding' header. 107s 107s :param enforce_content_length: 107s Enforce content length checking. Body returned by server must match 107s value of Content-Length header, if present. Otherwise, raise error. 107s """ 107s self.num_requests += 1 107s 107s timeout_obj = self._get_timeout(timeout) 107s timeout_obj.start_connect() 107s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 107s 107s try: 107s # Trigger any extra validation we need to do. 107s try: 107s self._validate_conn(conn) 107s except (SocketTimeout, BaseSSLError) as e: 107s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 107s raise 107s 107s # _validate_conn() starts the connection to an HTTPS proxy 107s # so we need to wrap errors with 'ProxyError' here too. 
107s except ( 107s OSError, 107s NewConnectionError, 107s TimeoutError, 107s BaseSSLError, 107s CertificateError, 107s SSLError, 107s ) as e: 107s new_e: Exception = e 107s if isinstance(e, (BaseSSLError, CertificateError)): 107s new_e = SSLError(e) 107s # If the connection didn't successfully connect to it's proxy 107s # then there 107s if isinstance( 107s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 107s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 107s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 107s raise new_e 107s 107s # conn.request() calls http.client.*.request, not the method in 107s # urllib3.request. It also calls makefile (recv) on the socket. 107s try: 107s conn.request( 107s method, 107s url, 107s body=body, 107s headers=headers, 107s chunked=chunked, 107s preload_content=preload_content, 107s decode_content=decode_content, 107s enforce_content_length=enforce_content_length, 107s ) 107s 107s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 107s # legitimately able to close the connection after sending a valid response. 107s # With this behaviour, the received response is still readable. 107s except BrokenPipeError: 107s pass 107s except OSError as e: 107s # MacOS/Linux 107s # EPROTOTYPE and ECONNRESET are needed on macOS 107s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 107s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 107s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 107s raise 107s 107s # Reset the timeout for the recv() on the socket 107s read_timeout = timeout_obj.read_timeout 107s 107s if not conn.is_closed: 107s # In Python 3 socket.py will catch EAGAIN and return None when you 107s # try and read into the file pointer created by http.client, which 107s # instead raises a BadStatusLine exception. Instead of catching 107s # the exception and assuming all BadStatusLine exceptions are read 107s # timeouts, check for a zero timeout before making the request. 107s if read_timeout == 0: 107s raise ReadTimeoutError( 107s self, url, f"Read timed out. (read timeout={read_timeout})" 107s ) 107s conn.timeout = read_timeout 107s 107s # Receive the response from the server 107s try: 107s response = conn.getresponse() 107s except (BaseSSLError, OSError) as e: 107s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 107s raise 107s 107s # Set properties that are used by the pooling layer. 
107s response.retries = retries 107s response._connection = response_conn # type: ignore[attr-defined] 107s response._pool = self # type: ignore[attr-defined] 107s 107s log.debug( 107s '%s://%s:%s "%s %s %s" %s %s', 107s self.scheme, 107s self.host, 107s self.port, 107s method, 107s url, 107s > response.version_string, 107s response.status, 107s response.length_remaining, 107s ) 107s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 107s 107s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 107s ----------------------------- Captured stderr call ----------------------------- 107s 127.0.0.1 - - [19/Jan/2025 20:23:21] "GET / HTTP/1.1" 200 9358 107s _______________________________ test_body[http] ________________________________ 107s 107s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-1/test_body_http_0') 107s httpbin_both = 107s verify_pool_mgr = 107s 107s def test_body(tmpdir, httpbin_both, verify_pool_mgr): 107s """Ensure the responses are all identical enough""" 107s url = httpbin_both.url + "/bytes/1024" 107s with vcr.use_cassette(str(tmpdir.join("body.yaml"))): 107s > content = verify_pool_mgr.request("GET", url).data 107s 107s tests/integration/test_urllib3.py:55: 107s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 107s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 107s return self.request_encode_url( 107s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 107s return self.urlopen(method, url, **extra_kw) 107s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 107s response = conn.urlopen(method, u.request_uri, **kw) 107s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 107s response = self._make_request( 107s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 107s 107s self = 107s conn = 107s method = 'GET', url = '/bytes/1024', body = None, headers = {} 107s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 107s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 107s chunked = False, response_conn = None, preload_content = True 107s decode_content = True, enforce_content_length = True 107s 107s def _make_request( 107s self, 107s conn: BaseHTTPConnection, 107s method: str, 107s url: str, 107s body: _TYPE_BODY | None = None, 107s headers: typing.Mapping[str, str] | None = None, 107s retries: Retry | None = None, 107s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 107s chunked: bool = False, 107s response_conn: BaseHTTPConnection | None = None, 107s preload_content: bool = True, 107s decode_content: bool = True, 107s enforce_content_length: bool = True, 107s ) -> BaseHTTPResponse: 107s """ 107s Perform a request on a given urllib connection object taken from our 107s pool. 107s 107s :param conn: 107s a connection from one of our connection pools 107s 107s :param method: 107s HTTP request method (such as GET, POST, PUT, etc.) 107s 107s :param url: 107s The URL to perform the request on. 107s 107s :param body: 107s Data to send in the request body, either :class:`str`, :class:`bytes`, 107s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 107s 107s :param headers: 107s Dictionary of custom headers to send, such as User-Agent, 107s If-None-Match, etc. If None, pool headers are used. If provided, 107s these headers completely replace any pool-specific headers. 
107s 107s :param retries: 107s Configure the number of retries to allow before raising a 107s :class:`~urllib3.exceptions.MaxRetryError` exception. 107s 107s Pass ``None`` to retry until you receive a response. Pass a 107s :class:`~urllib3.util.retry.Retry` object for fine-grained control 107s over different types of retries. 107s Pass an integer number to retry connection errors that many times, 107s but no other types of errors. Pass zero to never retry. 107s 107s If ``False``, then retries are disabled and any exception is raised 107s immediately. Also, instead of raising a MaxRetryError on redirects, 107s the redirect response will be returned. 107s 107s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 107s 107s :param timeout: 107s If specified, overrides the default timeout for this one 107s request. It may be a float (in seconds) or an instance of 107s :class:`urllib3.util.Timeout`. 107s 107s :param chunked: 107s If True, urllib3 will send the body using chunked transfer 107s encoding. Otherwise, urllib3 will send the body using the standard 107s content-length form. Defaults to False. 107s 107s :param response_conn: 107s Set this to ``None`` if you will handle releasing the connection or 107s set the connection to have the response release it. 107s 107s :param preload_content: 107s If True, the response's body will be preloaded during construction. 107s 107s :param decode_content: 107s If True, will attempt to decode the body based on the 107s 'content-encoding' header. 107s 107s :param enforce_content_length: 107s Enforce content length checking. Body returned by server must match 107s value of Content-Length header, if present. Otherwise, raise error. 107s """ 107s self.num_requests += 1 107s 107s timeout_obj = self._get_timeout(timeout) 107s timeout_obj.start_connect() 107s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 107s 107s try: 107s # Trigger any extra validation we need to do. 107s try: 107s self._validate_conn(conn) 107s except (SocketTimeout, BaseSSLError) as e: 107s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 107s raise 107s 107s # _validate_conn() starts the connection to an HTTPS proxy 107s # so we need to wrap errors with 'ProxyError' here too. 107s except ( 107s OSError, 107s NewConnectionError, 107s TimeoutError, 107s BaseSSLError, 107s CertificateError, 107s SSLError, 107s ) as e: 107s new_e: Exception = e 107s if isinstance(e, (BaseSSLError, CertificateError)): 107s new_e = SSLError(e) 107s # If the connection didn't successfully connect to it's proxy 107s # then there 107s if isinstance( 107s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 107s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 107s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 107s raise new_e 107s 107s # conn.request() calls http.client.*.request, not the method in 107s # urllib3.request. It also calls makefile (recv) on the socket. 107s try: 107s conn.request( 107s method, 107s url, 107s body=body, 107s headers=headers, 107s chunked=chunked, 107s preload_content=preload_content, 107s decode_content=decode_content, 107s enforce_content_length=enforce_content_length, 107s ) 107s 107s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 107s # legitimately able to close the connection after sending a valid response. 107s # With this behaviour, the received response is still readable. 
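[Annotation] The comment block just above explains why a failed send is tolerated: the server may legitimately close the connection after it has already written a valid response. A minimal sketch of that pattern follows; the helper name is made up for illustration and is not part of urllib3:

    import errno

    # Hedged sketch of the send-side error handling described above: ignore
    # EPIPE (and the macOS EPROTOTYPE/ECONNRESET variants) so the response
    # that may already be in the receive buffer can still be read.
    def send_ignoring_early_close(send_callable):
        try:
            send_callable()
        except BrokenPipeError:
            pass  # server closed after answering; go on and read the response
        except OSError as exc:
            if exc.errno not in (errno.EPROTOTYPE, errno.ECONNRESET):
                raise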
107s except BrokenPipeError: 107s pass 107s except OSError as e: 107s # MacOS/Linux 107s # EPROTOTYPE and ECONNRESET are needed on macOS 107s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 107s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 107s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 107s raise 107s 107s # Reset the timeout for the recv() on the socket 107s read_timeout = timeout_obj.read_timeout 107s 107s if not conn.is_closed: 107s # In Python 3 socket.py will catch EAGAIN and return None when you 107s # try and read into the file pointer created by http.client, which 107s # instead raises a BadStatusLine exception. Instead of catching 107s # the exception and assuming all BadStatusLine exceptions are read 107s # timeouts, check for a zero timeout before making the request. 107s if read_timeout == 0: 107s raise ReadTimeoutError( 107s self, url, f"Read timed out. (read timeout={read_timeout})" 107s ) 107s conn.timeout = read_timeout 107s 107s # Receive the response from the server 107s try: 107s response = conn.getresponse() 107s except (BaseSSLError, OSError) as e: 107s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 107s raise 107s 107s # Set properties that are used by the pooling layer. 107s response.retries = retries 107s response._connection = response_conn # type: ignore[attr-defined] 107s response._pool = self # type: ignore[attr-defined] 107s 107s log.debug( 107s '%s://%s:%s "%s %s %s" %s %s', 107s self.scheme, 107s self.host, 107s self.port, 107s method, 107s url, 107s > response.version_string, 107s response.status, 107s response.length_remaining, 107s ) 107s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 107s 107s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 107s ----------------------------- Captured stderr call ----------------------------- 107s 127.0.0.1 - - [19/Jan/2025 20:23:21] "GET /bytes/1024 HTTP/1.1" 200 1024 107s _______________________________ test_auth[http] ________________________________ 107s 107s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-1/test_auth_http_0') 107s httpbin_both = 107s verify_pool_mgr = 107s 107s def test_auth(tmpdir, httpbin_both, verify_pool_mgr): 107s """Ensure that we can handle basic auth""" 107s auth = ("user", "passwd") 107s headers = urllib3.util.make_headers(basic_auth="{}:{}".format(*auth)) 107s url = httpbin_both.url + "/basic-auth/user/passwd" 107s with vcr.use_cassette(str(tmpdir.join("auth.yaml"))): 107s > one = verify_pool_mgr.request("GET", url, headers=headers) 107s 107s tests/integration/test_urllib3.py:67: 107s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 107s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 107s return self.request_encode_url( 107s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 107s return self.urlopen(method, url, **extra_kw) 107s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 107s response = conn.urlopen(method, u.request_uri, **kw) 107s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 107s response = self._make_request( 107s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 107s 107s self = 107s conn = 107s method = 'GET', url = '/basic-auth/user/passwd', body = None 107s headers = {'authorization': 'Basic dXNlcjpwYXNzd2Q='} 107s retries = Retry(total=3, 
connect=None, read=None, redirect=None, status=None) 107s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 107s chunked = False, response_conn = None, preload_content = True 107s decode_content = True, enforce_content_length = True 107s 107s def _make_request( 107s self, 107s conn: BaseHTTPConnection, 107s method: str, 107s url: str, 107s body: _TYPE_BODY | None = None, 107s headers: typing.Mapping[str, str] | None = None, 107s retries: Retry | None = None, 107s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 107s chunked: bool = False, 107s response_conn: BaseHTTPConnection | None = None, 107s preload_content: bool = True, 107s decode_content: bool = True, 107s enforce_content_length: bool = True, 107s ) -> BaseHTTPResponse: 107s """ 107s Perform a request on a given urllib connection object taken from our 107s pool. 107s 107s :param conn: 107s a connection from one of our connection pools 107s 107s :param method: 107s HTTP request method (such as GET, POST, PUT, etc.) 107s 107s :param url: 107s The URL to perform the request on. 107s 107s :param body: 107s Data to send in the request body, either :class:`str`, :class:`bytes`, 107s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 107s 107s :param headers: 107s Dictionary of custom headers to send, such as User-Agent, 107s If-None-Match, etc. If None, pool headers are used. If provided, 107s these headers completely replace any pool-specific headers. 107s 107s :param retries: 107s Configure the number of retries to allow before raising a 107s :class:`~urllib3.exceptions.MaxRetryError` exception. 107s 107s Pass ``None`` to retry until you receive a response. Pass a 107s :class:`~urllib3.util.retry.Retry` object for fine-grained control 107s over different types of retries. 107s Pass an integer number to retry connection errors that many times, 107s but no other types of errors. Pass zero to never retry. 107s 107s If ``False``, then retries are disabled and any exception is raised 107s immediately. Also, instead of raising a MaxRetryError on redirects, 107s the redirect response will be returned. 107s 107s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 107s 107s :param timeout: 107s If specified, overrides the default timeout for this one 107s request. It may be a float (in seconds) or an instance of 107s :class:`urllib3.util.Timeout`. 107s 107s :param chunked: 107s If True, urllib3 will send the body using chunked transfer 107s encoding. Otherwise, urllib3 will send the body using the standard 107s content-length form. Defaults to False. 107s 107s :param response_conn: 107s Set this to ``None`` if you will handle releasing the connection or 107s set the connection to have the response release it. 107s 107s :param preload_content: 107s If True, the response's body will be preloaded during construction. 107s 107s :param decode_content: 107s If True, will attempt to decode the body based on the 107s 'content-encoding' header. 107s 107s :param enforce_content_length: 107s Enforce content length checking. Body returned by server must match 107s value of Content-Length header, if present. Otherwise, raise error. 107s """ 107s self.num_requests += 1 107s 107s timeout_obj = self._get_timeout(timeout) 107s timeout_obj.start_connect() 107s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 107s 107s try: 107s # Trigger any extra validation we need to do. 
107s try: 107s self._validate_conn(conn) 107s except (SocketTimeout, BaseSSLError) as e: 107s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 107s raise 107s 107s # _validate_conn() starts the connection to an HTTPS proxy 107s # so we need to wrap errors with 'ProxyError' here too. 107s except ( 107s OSError, 107s NewConnectionError, 107s TimeoutError, 107s BaseSSLError, 107s CertificateError, 107s SSLError, 107s ) as e: 107s new_e: Exception = e 107s if isinstance(e, (BaseSSLError, CertificateError)): 107s new_e = SSLError(e) 107s # If the connection didn't successfully connect to it's proxy 107s # then there 107s if isinstance( 107s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 107s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 107s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 107s raise new_e 107s 107s # conn.request() calls http.client.*.request, not the method in 107s # urllib3.request. It also calls makefile (recv) on the socket. 107s try: 107s conn.request( 107s method, 107s url, 107s body=body, 107s headers=headers, 107s chunked=chunked, 107s preload_content=preload_content, 107s decode_content=decode_content, 107s enforce_content_length=enforce_content_length, 107s ) 107s 107s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 107s # legitimately able to close the connection after sending a valid response. 107s # With this behaviour, the received response is still readable. 107s except BrokenPipeError: 107s pass 107s except OSError as e: 107s # MacOS/Linux 107s # EPROTOTYPE and ECONNRESET are needed on macOS 107s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 107s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 107s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 107s raise 107s 107s # Reset the timeout for the recv() on the socket 107s read_timeout = timeout_obj.read_timeout 107s 107s if not conn.is_closed: 107s # In Python 3 socket.py will catch EAGAIN and return None when you 107s # try and read into the file pointer created by http.client, which 107s # instead raises a BadStatusLine exception. Instead of catching 107s # the exception and assuming all BadStatusLine exceptions are read 107s # timeouts, check for a zero timeout before making the request. 107s if read_timeout == 0: 107s raise ReadTimeoutError( 107s self, url, f"Read timed out. (read timeout={read_timeout})" 107s ) 107s conn.timeout = read_timeout 107s 107s # Receive the response from the server 107s try: 107s response = conn.getresponse() 107s except (BaseSSLError, OSError) as e: 107s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 107s raise 107s 107s # Set properties that are used by the pooling layer. 
107s response.retries = retries 107s response._connection = response_conn # type: ignore[attr-defined] 107s response._pool = self # type: ignore[attr-defined] 107s 107s log.debug( 107s '%s://%s:%s "%s %s %s" %s %s', 107s self.scheme, 107s self.host, 107s self.port, 107s method, 107s url, 107s > response.version_string, 107s response.status, 107s response.length_remaining, 107s ) 107s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 107s 107s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 107s ----------------------------- Captured stderr call ----------------------------- 107s 127.0.0.1 - - [19/Jan/2025 20:23:21] "GET /basic-auth/user/passwd HTTP/1.1" 200 46 107s ____________________________ test_auth_failed[http] ____________________________ 107s 107s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-1/test_auth_failed_http_0') 107s httpbin_both = 107s verify_pool_mgr = 107s 107s def test_auth_failed(tmpdir, httpbin_both, verify_pool_mgr): 107s """Ensure that we can save failed auth statuses""" 107s auth = ("user", "wrongwrongwrong") 107s headers = urllib3.util.make_headers(basic_auth="{}:{}".format(*auth)) 107s url = httpbin_both.url + "/basic-auth/user/passwd" 107s with vcr.use_cassette(str(tmpdir.join("auth-failed.yaml"))) as cass: 107s # Ensure that this is empty to begin with 107s assert_cassette_empty(cass) 107s > one = verify_pool_mgr.request("GET", url, headers=headers) 107s 107s tests/integration/test_urllib3.py:83: 107s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 107s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 107s return self.request_encode_url( 107s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 107s return self.urlopen(method, url, **extra_kw) 107s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 107s response = conn.urlopen(method, u.request_uri, **kw) 107s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 107s response = self._make_request( 107s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 107s 107s self = 107s conn = 107s method = 'GET', url = '/basic-auth/user/passwd', body = None 107s headers = {'authorization': 'Basic dXNlcjp3cm9uZ3dyb25nd3Jvbmc='} 107s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 107s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 107s chunked = False, response_conn = None, preload_content = True 107s decode_content = True, enforce_content_length = True 107s 107s def _make_request( 107s self, 107s conn: BaseHTTPConnection, 107s method: str, 107s url: str, 107s body: _TYPE_BODY | None = None, 107s headers: typing.Mapping[str, str] | None = None, 107s retries: Retry | None = None, 107s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 107s chunked: bool = False, 107s response_conn: BaseHTTPConnection | None = None, 107s preload_content: bool = True, 107s decode_content: bool = True, 107s enforce_content_length: bool = True, 107s ) -> BaseHTTPResponse: 107s """ 107s Perform a request on a given urllib connection object taken from our 107s pool. 107s 107s :param conn: 107s a connection from one of our connection pools 107s 107s :param method: 107s HTTP request method (such as GET, POST, PUT, etc.) 107s 107s :param url: 107s The URL to perform the request on. 
107s 107s :param body: 107s Data to send in the request body, either :class:`str`, :class:`bytes`, 107s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 107s 107s :param headers: 107s Dictionary of custom headers to send, such as User-Agent, 107s If-None-Match, etc. If None, pool headers are used. If provided, 107s these headers completely replace any pool-specific headers. 107s 107s :param retries: 107s Configure the number of retries to allow before raising a 107s :class:`~urllib3.exceptions.MaxRetryError` exception. 107s 107s Pass ``None`` to retry until you receive a response. Pass a 107s :class:`~urllib3.util.retry.Retry` object for fine-grained control 107s over different types of retries. 107s Pass an integer number to retry connection errors that many times, 107s but no other types of errors. Pass zero to never retry. 107s 107s If ``False``, then retries are disabled and any exception is raised 107s immediately. Also, instead of raising a MaxRetryError on redirects, 107s the redirect response will be returned. 107s 107s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 107s 107s :param timeout: 107s If specified, overrides the default timeout for this one 107s request. It may be a float (in seconds) or an instance of 107s :class:`urllib3.util.Timeout`. 107s 107s :param chunked: 107s If True, urllib3 will send the body using chunked transfer 107s encoding. Otherwise, urllib3 will send the body using the standard 107s content-length form. Defaults to False. 107s 107s :param response_conn: 107s Set this to ``None`` if you will handle releasing the connection or 107s set the connection to have the response release it. 107s 107s :param preload_content: 107s If True, the response's body will be preloaded during construction. 107s 107s :param decode_content: 107s If True, will attempt to decode the body based on the 107s 'content-encoding' header. 107s 107s :param enforce_content_length: 107s Enforce content length checking. Body returned by server must match 107s value of Content-Length header, if present. Otherwise, raise error. 107s """ 107s self.num_requests += 1 107s 107s timeout_obj = self._get_timeout(timeout) 107s timeout_obj.start_connect() 107s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 107s 107s try: 107s # Trigger any extra validation we need to do. 107s try: 107s self._validate_conn(conn) 107s except (SocketTimeout, BaseSSLError) as e: 107s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 107s raise 107s 107s # _validate_conn() starts the connection to an HTTPS proxy 107s # so we need to wrap errors with 'ProxyError' here too. 107s except ( 107s OSError, 107s NewConnectionError, 107s TimeoutError, 107s BaseSSLError, 107s CertificateError, 107s SSLError, 107s ) as e: 107s new_e: Exception = e 107s if isinstance(e, (BaseSSLError, CertificateError)): 107s new_e = SSLError(e) 107s # If the connection didn't successfully connect to it's proxy 107s # then there 107s if isinstance( 107s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 107s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 107s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 107s raise new_e 107s 107s # conn.request() calls http.client.*.request, not the method in 107s # urllib3.request. It also calls makefile (recv) on the socket. 
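[Annotation] The failing tests shown in this log all follow the same record/replay pattern: open a cassette with vcr.use_cassette and issue requests through a urllib3 pool manager. An illustrative, self-contained sketch of that pattern (cassette path and target URL are placeholders, not values from this run):

    import urllib3
    import vcr

    # First run records the interaction into the cassette; later runs replay it.
    pool = urllib3.PoolManager()
    with vcr.use_cassette("example-cassette.yaml"):
        resp = pool.request("GET", "https://httpbin.org/get")  # stand-in for the local httpbin used by the tests
        print(resp.status, len(resp.data))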
107s try: 107s conn.request( 107s method, 107s url, 107s body=body, 107s headers=headers, 107s chunked=chunked, 107s preload_content=preload_content, 107s decode_content=decode_content, 107s enforce_content_length=enforce_content_length, 107s ) 107s 107s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 107s # legitimately able to close the connection after sending a valid response. 107s # With this behaviour, the received response is still readable. 107s except BrokenPipeError: 107s pass 107s except OSError as e: 107s # MacOS/Linux 107s # EPROTOTYPE and ECONNRESET are needed on macOS 107s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 107s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 107s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 107s raise 107s 107s # Reset the timeout for the recv() on the socket 107s read_timeout = timeout_obj.read_timeout 107s 107s if not conn.is_closed: 107s # In Python 3 socket.py will catch EAGAIN and return None when you 107s # try and read into the file pointer created by http.client, which 107s # instead raises a BadStatusLine exception. Instead of catching 107s # the exception and assuming all BadStatusLine exceptions are read 107s # timeouts, check for a zero timeout before making the request. 107s if read_timeout == 0: 107s raise ReadTimeoutError( 107s self, url, f"Read timed out. (read timeout={read_timeout})" 107s ) 107s conn.timeout = read_timeout 107s 107s # Receive the response from the server 107s try: 107s response = conn.getresponse() 107s except (BaseSSLError, OSError) as e: 107s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 107s raise 107s 107s # Set properties that are used by the pooling layer. 
107s response.retries = retries 107s response._connection = response_conn # type: ignore[attr-defined] 107s response._pool = self # type: ignore[attr-defined] 107s 107s log.debug( 107s '%s://%s:%s "%s %s %s" %s %s', 107s self.scheme, 107s self.host, 107s self.port, 107s method, 107s url, 107s > response.version_string, 107s response.status, 107s response.length_remaining, 107s ) 107s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 107s 107s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 107s ----------------------------- Captured stderr call ----------------------------- 107s 127.0.0.1 - - [19/Jan/2025 20:23:21] "GET /basic-auth/user/passwd HTTP/1.1" 401 0 107s _______________________________ test_post[http] ________________________________ 107s 107s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-1/test_post_http_0') 107s httpbin_both = 107s verify_pool_mgr = 107s 107s def test_post(tmpdir, httpbin_both, verify_pool_mgr): 107s """Ensure that we can post and cache the results""" 107s data = {"key1": "value1", "key2": "value2"} 107s url = httpbin_both.url + "/post" 107s with vcr.use_cassette(str(tmpdir.join("verify_pool_mgr.yaml"))): 107s > req1 = verify_pool_mgr.request("POST", url, data).data 107s 107s tests/integration/test_urllib3.py:94: 107s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 107s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:143: in request 107s return self.request_encode_body( 107s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:278: in request_encode_body 107s return self.urlopen(method, url, **extra_kw) 107s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 107s response = conn.urlopen(method, u.request_uri, **kw) 107s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 107s response = self._make_request( 107s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 107s 107s self = 107s conn = 107s method = 'POST', url = '/post', body = {'key1': 'value1', 'key2': 'value2'} 107s headers = HTTPHeaderDict({}) 107s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 107s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 107s chunked = False, response_conn = None, preload_content = True 107s decode_content = True, enforce_content_length = True 107s 107s def _make_request( 107s self, 107s conn: BaseHTTPConnection, 107s method: str, 107s url: str, 107s body: _TYPE_BODY | None = None, 107s headers: typing.Mapping[str, str] | None = None, 107s retries: Retry | None = None, 107s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 107s chunked: bool = False, 107s response_conn: BaseHTTPConnection | None = None, 107s preload_content: bool = True, 107s decode_content: bool = True, 107s enforce_content_length: bool = True, 107s ) -> BaseHTTPResponse: 107s """ 107s Perform a request on a given urllib connection object taken from our 107s pool. 107s 107s :param conn: 107s a connection from one of our connection pools 107s 107s :param method: 107s HTTP request method (such as GET, POST, PUT, etc.) 107s 107s :param url: 107s The URL to perform the request on. 107s 107s :param body: 107s Data to send in the request body, either :class:`str`, :class:`bytes`, 107s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 107s 107s :param headers: 107s Dictionary of custom headers to send, such as User-Agent, 107s If-None-Match, etc. 
If None, pool headers are used. If provided, 107s these headers completely replace any pool-specific headers. 107s 107s :param retries: 107s Configure the number of retries to allow before raising a 107s :class:`~urllib3.exceptions.MaxRetryError` exception. 107s 107s Pass ``None`` to retry until you receive a response. Pass a 107s :class:`~urllib3.util.retry.Retry` object for fine-grained control 107s over different types of retries. 107s Pass an integer number to retry connection errors that many times, 107s but no other types of errors. Pass zero to never retry. 107s 107s If ``False``, then retries are disabled and any exception is raised 107s immediately. Also, instead of raising a MaxRetryError on redirects, 107s the redirect response will be returned. 107s 107s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 107s 107s :param timeout: 107s If specified, overrides the default timeout for this one 107s request. It may be a float (in seconds) or an instance of 107s :class:`urllib3.util.Timeout`. 107s 107s :param chunked: 107s If True, urllib3 will send the body using chunked transfer 107s encoding. Otherwise, urllib3 will send the body using the standard 107s content-length form. Defaults to False. 107s 107s :param response_conn: 107s Set this to ``None`` if you will handle releasing the connection or 107s set the connection to have the response release it. 107s 107s :param preload_content: 107s If True, the response's body will be preloaded during construction. 107s 107s :param decode_content: 107s If True, will attempt to decode the body based on the 107s 'content-encoding' header. 107s 107s :param enforce_content_length: 107s Enforce content length checking. Body returned by server must match 107s value of Content-Length header, if present. Otherwise, raise error. 107s """ 107s self.num_requests += 1 107s 107s timeout_obj = self._get_timeout(timeout) 107s timeout_obj.start_connect() 107s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 107s 107s try: 107s # Trigger any extra validation we need to do. 107s try: 107s self._validate_conn(conn) 107s except (SocketTimeout, BaseSSLError) as e: 107s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 107s raise 107s 107s # _validate_conn() starts the connection to an HTTPS proxy 107s # so we need to wrap errors with 'ProxyError' here too. 107s except ( 107s OSError, 107s NewConnectionError, 107s TimeoutError, 107s BaseSSLError, 107s CertificateError, 107s SSLError, 107s ) as e: 107s new_e: Exception = e 107s if isinstance(e, (BaseSSLError, CertificateError)): 107s new_e = SSLError(e) 107s # If the connection didn't successfully connect to it's proxy 107s # then there 107s if isinstance( 107s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 107s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 107s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 107s raise new_e 107s 107s # conn.request() calls http.client.*.request, not the method in 107s # urllib3.request. It also calls makefile (recv) on the socket. 107s try: 107s conn.request( 107s method, 107s url, 107s body=body, 107s headers=headers, 107s chunked=chunked, 107s preload_content=preload_content, 107s decode_content=decode_content, 107s enforce_content_length=enforce_content_length, 107s ) 107s 107s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 107s # legitimately able to close the connection after sending a valid response. 
107s # With this behaviour, the received response is still readable. 107s except BrokenPipeError: 107s pass 107s except OSError as e: 107s # MacOS/Linux 107s # EPROTOTYPE and ECONNRESET are needed on macOS 107s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 107s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 107s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 107s raise 107s 107s # Reset the timeout for the recv() on the socket 107s read_timeout = timeout_obj.read_timeout 107s 107s if not conn.is_closed: 107s # In Python 3 socket.py will catch EAGAIN and return None when you 107s # try and read into the file pointer created by http.client, which 107s # instead raises a BadStatusLine exception. Instead of catching 107s # the exception and assuming all BadStatusLine exceptions are read 107s # timeouts, check for a zero timeout before making the request. 107s if read_timeout == 0: 107s raise ReadTimeoutError( 107s self, url, f"Read timed out. (read timeout={read_timeout})" 107s ) 107s conn.timeout = read_timeout 107s 107s # Receive the response from the server 107s try: 107s response = conn.getresponse() 107s except (BaseSSLError, OSError) as e: 107s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 107s raise 107s 107s # Set properties that are used by the pooling layer. 107s response.retries = retries 107s response._connection = response_conn # type: ignore[attr-defined] 107s response._pool = self # type: ignore[attr-defined] 107s 107s log.debug( 107s '%s://%s:%s "%s %s %s" %s %s', 107s self.scheme, 107s self.host, 107s self.port, 107s method, 107s url, 107s > response.version_string, 107s response.status, 107s response.length_remaining, 107s ) 107s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 107s 107s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 107s ----------------------------- Captured stderr call ----------------------------- 107s 127.0.0.1 - - [19/Jan/2025 20:23:21] "POST /post HTTP/1.1" 501 159 107s _______________________________ test_gzip[http] ________________________________ 107s 107s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-1/test_gzip_http_0') 107s httpbin_both = 107s verify_pool_mgr = 107s 107s def test_gzip(tmpdir, httpbin_both, verify_pool_mgr): 107s """ 107s Ensure that requests (actually urllib3) is able to automatically decompress 107s the response body 107s """ 107s url = httpbin_both.url + "/gzip" 107s response = verify_pool_mgr.request("GET", url) 107s 107s with vcr.use_cassette(str(tmpdir.join("gzip.yaml"))): 107s > response = verify_pool_mgr.request("GET", url) 107s 107s tests/integration/test_urllib3.py:140: 107s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 107s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 107s return self.request_encode_url( 107s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 107s return self.urlopen(method, url, **extra_kw) 107s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 107s response = conn.urlopen(method, u.request_uri, **kw) 107s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 107s response = self._make_request( 107s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 107s 107s self = 107s conn = 107s method = 'GET', url = '/gzip', body = None, headers = {} 107s retries = Retry(total=3, 
connect=None, read=None, redirect=None, status=None) 107s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 107s chunked = False, response_conn = None, preload_content = True 107s decode_content = True, enforce_content_length = True 107s 107s def _make_request( 107s self, 107s conn: BaseHTTPConnection, 107s method: str, 107s url: str, 107s body: _TYPE_BODY | None = None, 107s headers: typing.Mapping[str, str] | None = None, 107s retries: Retry | None = None, 107s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 107s chunked: bool = False, 107s response_conn: BaseHTTPConnection | None = None, 107s preload_content: bool = True, 107s decode_content: bool = True, 107s enforce_content_length: bool = True, 107s ) -> BaseHTTPResponse: 107s """ 107s Perform a request on a given urllib connection object taken from our 107s pool. 107s 107s :param conn: 107s a connection from one of our connection pools 107s 107s :param method: 107s HTTP request method (such as GET, POST, PUT, etc.) 107s 107s :param url: 107s The URL to perform the request on. 107s 107s :param body: 107s Data to send in the request body, either :class:`str`, :class:`bytes`, 107s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 107s 107s :param headers: 107s Dictionary of custom headers to send, such as User-Agent, 107s If-None-Match, etc. If None, pool headers are used. If provided, 107s these headers completely replace any pool-specific headers. 107s 107s :param retries: 107s Configure the number of retries to allow before raising a 107s :class:`~urllib3.exceptions.MaxRetryError` exception. 107s 107s Pass ``None`` to retry until you receive a response. Pass a 107s :class:`~urllib3.util.retry.Retry` object for fine-grained control 107s over different types of retries. 107s Pass an integer number to retry connection errors that many times, 107s but no other types of errors. Pass zero to never retry. 107s 107s If ``False``, then retries are disabled and any exception is raised 107s immediately. Also, instead of raising a MaxRetryError on redirects, 107s the redirect response will be returned. 107s 107s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 107s 107s :param timeout: 107s If specified, overrides the default timeout for this one 107s request. It may be a float (in seconds) or an instance of 107s :class:`urllib3.util.Timeout`. 107s 107s :param chunked: 107s If True, urllib3 will send the body using chunked transfer 107s encoding. Otherwise, urllib3 will send the body using the standard 107s content-length form. Defaults to False. 107s 107s :param response_conn: 107s Set this to ``None`` if you will handle releasing the connection or 107s set the connection to have the response release it. 107s 107s :param preload_content: 107s If True, the response's body will be preloaded during construction. 107s 107s :param decode_content: 107s If True, will attempt to decode the body based on the 107s 'content-encoding' header. 107s 107s :param enforce_content_length: 107s Enforce content length checking. Body returned by server must match 107s value of Content-Length header, if present. Otherwise, raise error. 107s """ 107s self.num_requests += 1 107s 107s timeout_obj = self._get_timeout(timeout) 107s timeout_obj.start_connect() 107s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 107s 107s try: 107s # Trigger any extra validation we need to do. 
107s try: 107s self._validate_conn(conn) 107s except (SocketTimeout, BaseSSLError) as e: 107s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 107s raise 107s 107s # _validate_conn() starts the connection to an HTTPS proxy 107s # so we need to wrap errors with 'ProxyError' here too. 107s except ( 107s OSError, 107s NewConnectionError, 107s TimeoutError, 107s BaseSSLError, 107s CertificateError, 107s SSLError, 107s ) as e: 107s new_e: Exception = e 107s if isinstance(e, (BaseSSLError, CertificateError)): 107s new_e = SSLError(e) 107s # If the connection didn't successfully connect to it's proxy 107s # then there 107s if isinstance( 107s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 107s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 107s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 107s raise new_e 107s 107s # conn.request() calls http.client.*.request, not the method in 107s # urllib3.request. It also calls makefile (recv) on the socket. 107s try: 107s conn.request( 107s method, 107s url, 107s body=body, 107s headers=headers, 107s chunked=chunked, 107s preload_content=preload_content, 107s decode_content=decode_content, 107s enforce_content_length=enforce_content_length, 107s ) 107s 107s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 107s # legitimately able to close the connection after sending a valid response. 107s # With this behaviour, the received response is still readable. 107s except BrokenPipeError: 107s pass 107s except OSError as e: 107s # MacOS/Linux 107s # EPROTOTYPE and ECONNRESET are needed on macOS 107s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 107s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 107s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 107s raise 107s 107s # Reset the timeout for the recv() on the socket 107s read_timeout = timeout_obj.read_timeout 107s 107s if not conn.is_closed: 107s # In Python 3 socket.py will catch EAGAIN and return None when you 107s # try and read into the file pointer created by http.client, which 107s # instead raises a BadStatusLine exception. Instead of catching 107s # the exception and assuming all BadStatusLine exceptions are read 107s # timeouts, check for a zero timeout before making the request. 107s if read_timeout == 0: 107s raise ReadTimeoutError( 107s self, url, f"Read timed out. (read timeout={read_timeout})" 107s ) 107s conn.timeout = read_timeout 107s 107s # Receive the response from the server 107s try: 107s response = conn.getresponse() 107s except (BaseSSLError, OSError) as e: 107s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 107s raise 107s 107s # Set properties that are used by the pooling layer. 
107s response.retries = retries 107s response._connection = response_conn # type: ignore[attr-defined] 107s response._pool = self # type: ignore[attr-defined] 107s 107s log.debug( 107s '%s://%s:%s "%s %s %s" %s %s', 107s self.scheme, 107s self.host, 107s self.port, 107s method, 107s url, 107s > response.version_string, 107s response.status, 107s response.length_remaining, 107s ) 107s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 107s 107s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 107s ----------------------------- Captured stderr call ----------------------------- 107s 127.0.0.1 - - [19/Jan/2025 20:23:21] "GET /gzip HTTP/1.1" 200 165 107s 127.0.0.1 - - [19/Jan/2025 20:23:21] "GET /gzip HTTP/1.1" 200 165 107s ___________________________ test_status_code[https] ____________________________ 107s 107s httpbin_both = 107s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-1/test_status_code_https_0') 107s verify_pool_mgr = 107s 107s def test_status_code(httpbin_both, tmpdir, verify_pool_mgr): 107s """Ensure that we can read the status code""" 107s url = httpbin_both.url 107s with vcr.use_cassette(str(tmpdir.join("atts.yaml"))): 107s > status_code = verify_pool_mgr.request("GET", url).status 107s 107s tests/integration/test_urllib3.py:34: 107s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 107s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 107s return self.request_encode_url( 107s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 107s return self.urlopen(method, url, **extra_kw) 107s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 107s response = conn.urlopen(method, u.request_uri, **kw) 107s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 107s response = self._make_request( 107s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 107s 107s self = 107s conn = 107s method = 'GET', url = '/', body = None, headers = {} 107s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 107s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 107s chunked = False, response_conn = None, preload_content = True 107s decode_content = True, enforce_content_length = True 107s 107s def _make_request( 107s self, 107s conn: BaseHTTPConnection, 107s method: str, 107s url: str, 107s body: _TYPE_BODY | None = None, 107s headers: typing.Mapping[str, str] | None = None, 107s retries: Retry | None = None, 107s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 107s chunked: bool = False, 107s response_conn: BaseHTTPConnection | None = None, 107s preload_content: bool = True, 107s decode_content: bool = True, 107s enforce_content_length: bool = True, 107s ) -> BaseHTTPResponse: 107s """ 107s Perform a request on a given urllib connection object taken from our 107s pool. 107s 107s :param conn: 107s a connection from one of our connection pools 107s 107s :param method: 107s HTTP request method (such as GET, POST, PUT, etc.) 107s 107s :param url: 107s The URL to perform the request on. 107s 107s :param body: 107s Data to send in the request body, either :class:`str`, :class:`bytes`, 107s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 107s 107s :param headers: 107s Dictionary of custom headers to send, such as User-Agent, 107s If-None-Match, etc. If None, pool headers are used. 
If provided, 107s these headers completely replace any pool-specific headers. 107s 107s :param retries: 107s Configure the number of retries to allow before raising a 107s :class:`~urllib3.exceptions.MaxRetryError` exception. 107s 107s Pass ``None`` to retry until you receive a response. Pass a 107s :class:`~urllib3.util.retry.Retry` object for fine-grained control 107s over different types of retries. 107s Pass an integer number to retry connection errors that many times, 107s but no other types of errors. Pass zero to never retry. 107s 107s If ``False``, then retries are disabled and any exception is raised 107s immediately. Also, instead of raising a MaxRetryError on redirects, 107s the redirect response will be returned. 107s 107s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 107s 107s :param timeout: 107s If specified, overrides the default timeout for this one 107s request. It may be a float (in seconds) or an instance of 107s :class:`urllib3.util.Timeout`. 107s 107s :param chunked: 107s If True, urllib3 will send the body using chunked transfer 107s encoding. Otherwise, urllib3 will send the body using the standard 107s content-length form. Defaults to False. 107s 107s :param response_conn: 107s Set this to ``None`` if you will handle releasing the connection or 107s set the connection to have the response release it. 107s 107s :param preload_content: 107s If True, the response's body will be preloaded during construction. 107s 107s :param decode_content: 107s If True, will attempt to decode the body based on the 107s 'content-encoding' header. 107s 107s :param enforce_content_length: 107s Enforce content length checking. Body returned by server must match 107s value of Content-Length header, if present. Otherwise, raise error. 107s """ 107s self.num_requests += 1 107s 107s timeout_obj = self._get_timeout(timeout) 107s timeout_obj.start_connect() 107s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 107s 107s try: 107s # Trigger any extra validation we need to do. 107s try: 107s self._validate_conn(conn) 107s except (SocketTimeout, BaseSSLError) as e: 107s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 107s raise 107s 107s # _validate_conn() starts the connection to an HTTPS proxy 107s # so we need to wrap errors with 'ProxyError' here too. 107s except ( 107s OSError, 107s NewConnectionError, 107s TimeoutError, 107s BaseSSLError, 107s CertificateError, 107s SSLError, 107s ) as e: 107s new_e: Exception = e 107s if isinstance(e, (BaseSSLError, CertificateError)): 107s new_e = SSLError(e) 107s # If the connection didn't successfully connect to it's proxy 107s # then there 107s if isinstance( 107s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 107s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 107s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 107s raise new_e 107s 107s # conn.request() calls http.client.*.request, not the method in 107s # urllib3.request. It also calls makefile (recv) on the socket. 107s try: 107s conn.request( 107s method, 107s url, 107s body=body, 107s headers=headers, 107s chunked=chunked, 107s preload_content=preload_content, 107s decode_content=decode_content, 107s enforce_content_length=enforce_content_length, 107s ) 107s 107s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 107s # legitimately able to close the connection after sending a valid response. 107s # With this behaviour, the received response is still readable. 
107s except BrokenPipeError: 107s pass 107s except OSError as e: 107s # MacOS/Linux 107s # EPROTOTYPE and ECONNRESET are needed on macOS 107s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 107s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 107s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 107s raise 107s 107s # Reset the timeout for the recv() on the socket 107s read_timeout = timeout_obj.read_timeout 107s 107s if not conn.is_closed: 107s # In Python 3 socket.py will catch EAGAIN and return None when you 107s # try and read into the file pointer created by http.client, which 107s # instead raises a BadStatusLine exception. Instead of catching 107s # the exception and assuming all BadStatusLine exceptions are read 107s # timeouts, check for a zero timeout before making the request. 107s if read_timeout == 0: 107s raise ReadTimeoutError( 107s self, url, f"Read timed out. (read timeout={read_timeout})" 107s ) 107s conn.timeout = read_timeout 107s 107s # Receive the response from the server 107s try: 107s response = conn.getresponse() 107s except (BaseSSLError, OSError) as e: 107s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 107s raise 107s 107s # Set properties that are used by the pooling layer. 107s response.retries = retries 107s response._connection = response_conn # type: ignore[attr-defined] 107s response._pool = self # type: ignore[attr-defined] 107s 107s log.debug( 107s '%s://%s:%s "%s %s %s" %s %s', 107s self.scheme, 107s self.host, 107s self.port, 107s method, 107s url, 107s > response.version_string, 107s response.status, 107s response.length_remaining, 107s ) 107s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 107s 107s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 107s ----------------------------- Captured stderr call ----------------------------- 107s 127.0.0.1 - - [19/Jan/2025 20:23:21] "GET / HTTP/1.1" 200 9358 107s _____________________________ test_headers[https] ______________________________ 107s 107s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-1/test_headers_https_0') 107s httpbin_both = 107s verify_pool_mgr = 107s 107s def test_headers(tmpdir, httpbin_both, verify_pool_mgr): 107s """Ensure that we can read the headers back""" 107s url = httpbin_both.url 107s with vcr.use_cassette(str(tmpdir.join("headers.yaml"))): 107s > headers = verify_pool_mgr.request("GET", url).headers 107s 107s tests/integration/test_urllib3.py:44: 107s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 107s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 107s return self.request_encode_url( 107s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 107s return self.urlopen(method, url, **extra_kw) 107s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 107s response = conn.urlopen(method, u.request_uri, **kw) 107s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 107s response = self._make_request( 107s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 107s 107s self = 107s conn = 107s method = 'GET', url = '/', body = None, headers = {} 107s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 107s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 107s chunked = False, response_conn = None, 
preload_content = True 107s decode_content = True, enforce_content_length = True 107s 107s def _make_request( 107s self, 107s conn: BaseHTTPConnection, 107s method: str, 107s url: str, 107s body: _TYPE_BODY | None = None, 107s headers: typing.Mapping[str, str] | None = None, 107s retries: Retry | None = None, 107s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 107s chunked: bool = False, 107s response_conn: BaseHTTPConnection | None = None, 107s preload_content: bool = True, 107s decode_content: bool = True, 107s enforce_content_length: bool = True, 107s ) -> BaseHTTPResponse: 107s """ 107s Perform a request on a given urllib connection object taken from our 107s pool. 107s 107s :param conn: 107s a connection from one of our connection pools 107s 107s :param method: 107s HTTP request method (such as GET, POST, PUT, etc.) 107s 107s :param url: 107s The URL to perform the request on. 107s 107s :param body: 107s Data to send in the request body, either :class:`str`, :class:`bytes`, 107s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 107s 107s :param headers: 107s Dictionary of custom headers to send, such as User-Agent, 107s If-None-Match, etc. If None, pool headers are used. If provided, 107s these headers completely replace any pool-specific headers. 107s 107s :param retries: 107s Configure the number of retries to allow before raising a 107s :class:`~urllib3.exceptions.MaxRetryError` exception. 107s 107s Pass ``None`` to retry until you receive a response. Pass a 107s :class:`~urllib3.util.retry.Retry` object for fine-grained control 107s over different types of retries. 107s Pass an integer number to retry connection errors that many times, 107s but no other types of errors. Pass zero to never retry. 107s 107s If ``False``, then retries are disabled and any exception is raised 107s immediately. Also, instead of raising a MaxRetryError on redirects, 107s the redirect response will be returned. 107s 107s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 107s 107s :param timeout: 107s If specified, overrides the default timeout for this one 107s request. It may be a float (in seconds) or an instance of 107s :class:`urllib3.util.Timeout`. 107s 107s :param chunked: 107s If True, urllib3 will send the body using chunked transfer 107s encoding. Otherwise, urllib3 will send the body using the standard 107s content-length form. Defaults to False. 107s 107s :param response_conn: 107s Set this to ``None`` if you will handle releasing the connection or 107s set the connection to have the response release it. 107s 107s :param preload_content: 107s If True, the response's body will be preloaded during construction. 107s 107s :param decode_content: 107s If True, will attempt to decode the body based on the 107s 'content-encoding' header. 107s 107s :param enforce_content_length: 107s Enforce content length checking. Body returned by server must match 107s value of Content-Length header, if present. Otherwise, raise error. 107s """ 107s self.num_requests += 1 107s 107s timeout_obj = self._get_timeout(timeout) 107s timeout_obj.start_connect() 107s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 107s 107s try: 107s # Trigger any extra validation we need to do. 
107s try: 107s self._validate_conn(conn) 107s except (SocketTimeout, BaseSSLError) as e: 107s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 107s raise 107s 107s # _validate_conn() starts the connection to an HTTPS proxy 107s # so we need to wrap errors with 'ProxyError' here too. 107s except ( 107s OSError, 107s NewConnectionError, 107s TimeoutError, 107s BaseSSLError, 107s CertificateError, 107s SSLError, 107s ) as e: 107s new_e: Exception = e 107s if isinstance(e, (BaseSSLError, CertificateError)): 107s new_e = SSLError(e) 107s # If the connection didn't successfully connect to it's proxy 107s # then there 107s if isinstance( 107s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 107s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 107s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 107s raise new_e 107s 107s # conn.request() calls http.client.*.request, not the method in 107s # urllib3.request. It also calls makefile (recv) on the socket. 107s try: 107s conn.request( 107s method, 107s url, 107s body=body, 107s headers=headers, 107s chunked=chunked, 107s preload_content=preload_content, 107s decode_content=decode_content, 107s enforce_content_length=enforce_content_length, 107s ) 107s 107s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 107s # legitimately able to close the connection after sending a valid response. 107s # With this behaviour, the received response is still readable. 107s except BrokenPipeError: 107s pass 107s except OSError as e: 107s # MacOS/Linux 107s # EPROTOTYPE and ECONNRESET are needed on macOS 107s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 107s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 107s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 107s raise 107s 107s # Reset the timeout for the recv() on the socket 107s read_timeout = timeout_obj.read_timeout 107s 107s if not conn.is_closed: 107s # In Python 3 socket.py will catch EAGAIN and return None when you 107s # try and read into the file pointer created by http.client, which 107s # instead raises a BadStatusLine exception. Instead of catching 107s # the exception and assuming all BadStatusLine exceptions are read 107s # timeouts, check for a zero timeout before making the request. 107s if read_timeout == 0: 107s raise ReadTimeoutError( 107s self, url, f"Read timed out. (read timeout={read_timeout})" 107s ) 107s conn.timeout = read_timeout 107s 107s # Receive the response from the server 107s try: 107s response = conn.getresponse() 107s except (BaseSSLError, OSError) as e: 107s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 107s raise 107s 107s # Set properties that are used by the pooling layer. 
107s response.retries = retries 107s response._connection = response_conn # type: ignore[attr-defined] 107s response._pool = self # type: ignore[attr-defined] 107s 107s log.debug( 107s '%s://%s:%s "%s %s %s" %s %s', 107s self.scheme, 107s self.host, 107s self.port, 107s method, 107s url, 107s > response.version_string, 107s response.status, 107s response.length_remaining, 107s ) 107s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 107s 107s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 107s ----------------------------- Captured stderr call ----------------------------- 107s 127.0.0.1 - - [19/Jan/2025 20:23:21] "GET / HTTP/1.1" 200 9358 107s _______________________________ test_body[https] _______________________________ 107s 107s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-1/test_body_https_0') 107s httpbin_both = 107s verify_pool_mgr = 107s 107s def test_body(tmpdir, httpbin_both, verify_pool_mgr): 107s """Ensure the responses are all identical enough""" 107s url = httpbin_both.url + "/bytes/1024" 107s with vcr.use_cassette(str(tmpdir.join("body.yaml"))): 107s > content = verify_pool_mgr.request("GET", url).data 107s 107s tests/integration/test_urllib3.py:55: 107s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 107s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 107s return self.request_encode_url( 107s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 107s return self.urlopen(method, url, **extra_kw) 107s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 107s response = conn.urlopen(method, u.request_uri, **kw) 107s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 107s response = self._make_request( 107s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 107s 107s self = 107s conn = 107s method = 'GET', url = '/bytes/1024', body = None, headers = {} 107s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 107s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 107s chunked = False, response_conn = None, preload_content = True 107s decode_content = True, enforce_content_length = True 107s 107s def _make_request( 107s self, 107s conn: BaseHTTPConnection, 107s method: str, 107s url: str, 107s body: _TYPE_BODY | None = None, 107s headers: typing.Mapping[str, str] | None = None, 107s retries: Retry | None = None, 107s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 107s chunked: bool = False, 107s response_conn: BaseHTTPConnection | None = None, 107s preload_content: bool = True, 107s decode_content: bool = True, 107s enforce_content_length: bool = True, 107s ) -> BaseHTTPResponse: 107s """ 107s Perform a request on a given urllib connection object taken from our 107s pool. 107s 107s :param conn: 107s a connection from one of our connection pools 107s 107s :param method: 107s HTTP request method (such as GET, POST, PUT, etc.) 107s 107s :param url: 107s The URL to perform the request on. 107s 107s :param body: 107s Data to send in the request body, either :class:`str`, :class:`bytes`, 107s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 107s 107s :param headers: 107s Dictionary of custom headers to send, such as User-Agent, 107s If-None-Match, etc. If None, pool headers are used. If provided, 107s these headers completely replace any pool-specific headers. 
107s 107s :param retries: 107s Configure the number of retries to allow before raising a 107s :class:`~urllib3.exceptions.MaxRetryError` exception. 107s 107s Pass ``None`` to retry until you receive a response. Pass a 107s :class:`~urllib3.util.retry.Retry` object for fine-grained control 107s over different types of retries. 107s Pass an integer number to retry connection errors that many times, 107s but no other types of errors. Pass zero to never retry. 107s 107s If ``False``, then retries are disabled and any exception is raised 107s immediately. Also, instead of raising a MaxRetryError on redirects, 107s the redirect response will be returned. 107s 107s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 107s 107s :param timeout: 107s If specified, overrides the default timeout for this one 107s request. It may be a float (in seconds) or an instance of 107s :class:`urllib3.util.Timeout`. 107s 107s :param chunked: 107s If True, urllib3 will send the body using chunked transfer 107s encoding. Otherwise, urllib3 will send the body using the standard 107s content-length form. Defaults to False. 107s 107s :param response_conn: 107s Set this to ``None`` if you will handle releasing the connection or 107s set the connection to have the response release it. 107s 107s :param preload_content: 107s If True, the response's body will be preloaded during construction. 107s 107s :param decode_content: 107s If True, will attempt to decode the body based on the 107s 'content-encoding' header. 107s 107s :param enforce_content_length: 107s Enforce content length checking. Body returned by server must match 107s value of Content-Length header, if present. Otherwise, raise error. 107s """ 107s self.num_requests += 1 107s 107s timeout_obj = self._get_timeout(timeout) 107s timeout_obj.start_connect() 107s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 107s 107s try: 107s # Trigger any extra validation we need to do. 107s try: 107s self._validate_conn(conn) 107s except (SocketTimeout, BaseSSLError) as e: 107s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 107s raise 107s 107s # _validate_conn() starts the connection to an HTTPS proxy 107s # so we need to wrap errors with 'ProxyError' here too. 107s except ( 107s OSError, 107s NewConnectionError, 107s TimeoutError, 107s BaseSSLError, 107s CertificateError, 107s SSLError, 107s ) as e: 107s new_e: Exception = e 107s if isinstance(e, (BaseSSLError, CertificateError)): 107s new_e = SSLError(e) 107s # If the connection didn't successfully connect to it's proxy 107s # then there 107s if isinstance( 107s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 107s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 107s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 107s raise new_e 107s 107s # conn.request() calls http.client.*.request, not the method in 107s # urllib3.request. It also calls makefile (recv) on the socket. 107s try: 107s conn.request( 107s method, 107s url, 107s body=body, 107s headers=headers, 107s chunked=chunked, 107s preload_content=preload_content, 107s decode_content=decode_content, 107s enforce_content_length=enforce_content_length, 107s ) 107s 107s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 107s # legitimately able to close the connection after sending a valid response. 107s # With this behaviour, the received response is still readable. 
107s except BrokenPipeError: 107s pass 107s except OSError as e: 107s # MacOS/Linux 107s # EPROTOTYPE and ECONNRESET are needed on macOS 107s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 107s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 107s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 107s raise 107s 107s # Reset the timeout for the recv() on the socket 107s read_timeout = timeout_obj.read_timeout 107s 107s if not conn.is_closed: 107s # In Python 3 socket.py will catch EAGAIN and return None when you 107s # try and read into the file pointer created by http.client, which 107s # instead raises a BadStatusLine exception. Instead of catching 107s # the exception and assuming all BadStatusLine exceptions are read 107s # timeouts, check for a zero timeout before making the request. 107s if read_timeout == 0: 107s raise ReadTimeoutError( 107s self, url, f"Read timed out. (read timeout={read_timeout})" 107s ) 107s conn.timeout = read_timeout 107s 107s # Receive the response from the server 107s try: 107s response = conn.getresponse() 107s except (BaseSSLError, OSError) as e: 107s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 107s raise 107s 107s # Set properties that are used by the pooling layer. 107s response.retries = retries 107s response._connection = response_conn # type: ignore[attr-defined] 107s response._pool = self # type: ignore[attr-defined] 107s 107s log.debug( 107s '%s://%s:%s "%s %s %s" %s %s', 107s self.scheme, 107s self.host, 107s self.port, 107s method, 107s url, 107s > response.version_string, 107s response.status, 107s response.length_remaining, 107s ) 107s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 107s 107s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 107s ----------------------------- Captured stderr call ----------------------------- 107s 127.0.0.1 - - [19/Jan/2025 20:23:21] "GET /bytes/1024 HTTP/1.1" 200 1024 107s _______________________________ test_auth[https] _______________________________ 107s 107s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-1/test_auth_https_0') 107s httpbin_both = 107s verify_pool_mgr = 107s 107s def test_auth(tmpdir, httpbin_both, verify_pool_mgr): 107s """Ensure that we can handle basic auth""" 107s auth = ("user", "passwd") 107s headers = urllib3.util.make_headers(basic_auth="{}:{}".format(*auth)) 107s url = httpbin_both.url + "/basic-auth/user/passwd" 107s with vcr.use_cassette(str(tmpdir.join("auth.yaml"))): 107s > one = verify_pool_mgr.request("GET", url, headers=headers) 107s 107s tests/integration/test_urllib3.py:67: 107s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 107s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 107s return self.request_encode_url( 107s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 107s return self.urlopen(method, url, **extra_kw) 107s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 107s response = conn.urlopen(method, u.request_uri, **kw) 107s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 107s response = self._make_request( 107s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 107s 107s self = 107s conn = 107s method = 'GET', url = '/basic-auth/user/passwd', body = None 107s headers = {'authorization': 'Basic dXNlcjpwYXNzd2Q='} 107s retries = Retry(total=3, 
connect=None, read=None, redirect=None, status=None) 107s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 107s chunked = False, response_conn = None, preload_content = True 107s decode_content = True, enforce_content_length = True 107s 107s def _make_request( 107s self, 107s conn: BaseHTTPConnection, 107s method: str, 107s url: str, 107s body: _TYPE_BODY | None = None, 107s headers: typing.Mapping[str, str] | None = None, 107s retries: Retry | None = None, 107s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 107s chunked: bool = False, 107s response_conn: BaseHTTPConnection | None = None, 107s preload_content: bool = True, 107s decode_content: bool = True, 107s enforce_content_length: bool = True, 107s ) -> BaseHTTPResponse: 107s """ 107s Perform a request on a given urllib connection object taken from our 107s pool. 107s 107s :param conn: 107s a connection from one of our connection pools 107s 107s :param method: 107s HTTP request method (such as GET, POST, PUT, etc.) 107s 107s :param url: 107s The URL to perform the request on. 107s 107s :param body: 107s Data to send in the request body, either :class:`str`, :class:`bytes`, 107s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 107s 107s :param headers: 107s Dictionary of custom headers to send, such as User-Agent, 107s If-None-Match, etc. If None, pool headers are used. If provided, 107s these headers completely replace any pool-specific headers. 107s 107s :param retries: 107s Configure the number of retries to allow before raising a 107s :class:`~urllib3.exceptions.MaxRetryError` exception. 107s 107s Pass ``None`` to retry until you receive a response. Pass a 107s :class:`~urllib3.util.retry.Retry` object for fine-grained control 107s over different types of retries. 107s Pass an integer number to retry connection errors that many times, 107s but no other types of errors. Pass zero to never retry. 107s 107s If ``False``, then retries are disabled and any exception is raised 107s immediately. Also, instead of raising a MaxRetryError on redirects, 107s the redirect response will be returned. 107s 107s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 107s 107s :param timeout: 107s If specified, overrides the default timeout for this one 107s request. It may be a float (in seconds) or an instance of 107s :class:`urllib3.util.Timeout`. 107s 107s :param chunked: 107s If True, urllib3 will send the body using chunked transfer 107s encoding. Otherwise, urllib3 will send the body using the standard 107s content-length form. Defaults to False. 107s 107s :param response_conn: 107s Set this to ``None`` if you will handle releasing the connection or 107s set the connection to have the response release it. 107s 107s :param preload_content: 107s If True, the response's body will be preloaded during construction. 107s 107s :param decode_content: 107s If True, will attempt to decode the body based on the 107s 'content-encoding' header. 107s 107s :param enforce_content_length: 107s Enforce content length checking. Body returned by server must match 107s value of Content-Length header, if present. Otherwise, raise error. 107s """ 107s self.num_requests += 1 107s 107s timeout_obj = self._get_timeout(timeout) 107s timeout_obj.start_connect() 107s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 107s 107s try: 107s # Trigger any extra validation we need to do. 
107s try: 107s self._validate_conn(conn) 107s except (SocketTimeout, BaseSSLError) as e: 107s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 107s raise 107s 107s # _validate_conn() starts the connection to an HTTPS proxy 107s # so we need to wrap errors with 'ProxyError' here too. 107s except ( 107s OSError, 107s NewConnectionError, 107s TimeoutError, 107s BaseSSLError, 107s CertificateError, 107s SSLError, 107s ) as e: 107s new_e: Exception = e 107s if isinstance(e, (BaseSSLError, CertificateError)): 107s new_e = SSLError(e) 107s # If the connection didn't successfully connect to it's proxy 107s # then there 107s if isinstance( 107s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 107s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 107s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 107s raise new_e 107s 107s # conn.request() calls http.client.*.request, not the method in 107s # urllib3.request. It also calls makefile (recv) on the socket. 107s try: 107s conn.request( 107s method, 107s url, 107s body=body, 107s headers=headers, 107s chunked=chunked, 107s preload_content=preload_content, 107s decode_content=decode_content, 107s enforce_content_length=enforce_content_length, 107s ) 107s 107s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 107s # legitimately able to close the connection after sending a valid response. 107s # With this behaviour, the received response is still readable. 107s except BrokenPipeError: 107s pass 107s except OSError as e: 107s # MacOS/Linux 107s # EPROTOTYPE and ECONNRESET are needed on macOS 107s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 107s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 107s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 107s raise 107s 107s # Reset the timeout for the recv() on the socket 107s read_timeout = timeout_obj.read_timeout 107s 107s if not conn.is_closed: 107s # In Python 3 socket.py will catch EAGAIN and return None when you 107s # try and read into the file pointer created by http.client, which 107s # instead raises a BadStatusLine exception. Instead of catching 107s # the exception and assuming all BadStatusLine exceptions are read 107s # timeouts, check for a zero timeout before making the request. 107s if read_timeout == 0: 107s raise ReadTimeoutError( 107s self, url, f"Read timed out. (read timeout={read_timeout})" 107s ) 107s conn.timeout = read_timeout 107s 107s # Receive the response from the server 107s try: 107s response = conn.getresponse() 107s except (BaseSSLError, OSError) as e: 107s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 107s raise 107s 107s # Set properties that are used by the pooling layer. 
107s response.retries = retries 107s response._connection = response_conn # type: ignore[attr-defined] 107s response._pool = self # type: ignore[attr-defined] 107s 107s log.debug( 107s '%s://%s:%s "%s %s %s" %s %s', 107s self.scheme, 107s self.host, 107s self.port, 107s method, 107s url, 107s > response.version_string, 107s response.status, 107s response.length_remaining, 107s ) 107s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 107s 107s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 107s ----------------------------- Captured stderr call ----------------------------- 107s 127.0.0.1 - - [19/Jan/2025 20:23:21] "GET /basic-auth/user/passwd HTTP/1.1" 200 46 107s ___________________________ test_auth_failed[https] ____________________________ 107s 107s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-1/test_auth_failed_https_0') 107s httpbin_both = 107s verify_pool_mgr = 107s 107s def test_auth_failed(tmpdir, httpbin_both, verify_pool_mgr): 107s """Ensure that we can save failed auth statuses""" 107s auth = ("user", "wrongwrongwrong") 107s headers = urllib3.util.make_headers(basic_auth="{}:{}".format(*auth)) 107s url = httpbin_both.url + "/basic-auth/user/passwd" 107s with vcr.use_cassette(str(tmpdir.join("auth-failed.yaml"))) as cass: 107s # Ensure that this is empty to begin with 107s assert_cassette_empty(cass) 107s > one = verify_pool_mgr.request("GET", url, headers=headers) 107s 107s tests/integration/test_urllib3.py:83: 107s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 107s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 107s return self.request_encode_url( 107s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 107s return self.urlopen(method, url, **extra_kw) 107s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 107s response = conn.urlopen(method, u.request_uri, **kw) 107s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 107s response = self._make_request( 107s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 107s 107s self = 107s conn = 107s method = 'GET', url = '/basic-auth/user/passwd', body = None 107s headers = {'authorization': 'Basic dXNlcjp3cm9uZ3dyb25nd3Jvbmc='} 107s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 107s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 107s chunked = False, response_conn = None, preload_content = True 107s decode_content = True, enforce_content_length = True 107s 107s def _make_request( 107s self, 107s conn: BaseHTTPConnection, 107s method: str, 107s url: str, 107s body: _TYPE_BODY | None = None, 107s headers: typing.Mapping[str, str] | None = None, 107s retries: Retry | None = None, 107s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 107s chunked: bool = False, 107s response_conn: BaseHTTPConnection | None = None, 107s preload_content: bool = True, 107s decode_content: bool = True, 107s enforce_content_length: bool = True, 107s ) -> BaseHTTPResponse: 107s """ 107s Perform a request on a given urllib connection object taken from our 107s pool. 107s 107s :param conn: 107s a connection from one of our connection pools 107s 107s :param method: 107s HTTP request method (such as GET, POST, PUT, etc.) 107s 107s :param url: 107s The URL to perform the request on. 
107s 107s :param body: 107s Data to send in the request body, either :class:`str`, :class:`bytes`, 107s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 107s 107s :param headers: 107s Dictionary of custom headers to send, such as User-Agent, 107s If-None-Match, etc. If None, pool headers are used. If provided, 107s these headers completely replace any pool-specific headers. 107s 107s :param retries: 107s Configure the number of retries to allow before raising a 107s :class:`~urllib3.exceptions.MaxRetryError` exception. 107s 107s Pass ``None`` to retry until you receive a response. Pass a 107s :class:`~urllib3.util.retry.Retry` object for fine-grained control 107s over different types of retries. 107s Pass an integer number to retry connection errors that many times, 107s but no other types of errors. Pass zero to never retry. 107s 107s If ``False``, then retries are disabled and any exception is raised 107s immediately. Also, instead of raising a MaxRetryError on redirects, 107s the redirect response will be returned. 107s 107s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 107s 107s :param timeout: 107s If specified, overrides the default timeout for this one 107s request. It may be a float (in seconds) or an instance of 107s :class:`urllib3.util.Timeout`. 107s 107s :param chunked: 107s If True, urllib3 will send the body using chunked transfer 107s encoding. Otherwise, urllib3 will send the body using the standard 107s content-length form. Defaults to False. 107s 107s :param response_conn: 107s Set this to ``None`` if you will handle releasing the connection or 107s set the connection to have the response release it. 107s 107s :param preload_content: 107s If True, the response's body will be preloaded during construction. 107s 107s :param decode_content: 107s If True, will attempt to decode the body based on the 107s 'content-encoding' header. 107s 107s :param enforce_content_length: 107s Enforce content length checking. Body returned by server must match 107s value of Content-Length header, if present. Otherwise, raise error. 107s """ 107s self.num_requests += 1 107s 107s timeout_obj = self._get_timeout(timeout) 107s timeout_obj.start_connect() 107s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 107s 107s try: 107s # Trigger any extra validation we need to do. 107s try: 107s self._validate_conn(conn) 107s except (SocketTimeout, BaseSSLError) as e: 107s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 107s raise 107s 107s # _validate_conn() starts the connection to an HTTPS proxy 107s # so we need to wrap errors with 'ProxyError' here too. 107s except ( 107s OSError, 107s NewConnectionError, 107s TimeoutError, 107s BaseSSLError, 107s CertificateError, 107s SSLError, 107s ) as e: 107s new_e: Exception = e 107s if isinstance(e, (BaseSSLError, CertificateError)): 107s new_e = SSLError(e) 107s # If the connection didn't successfully connect to it's proxy 107s # then there 107s if isinstance( 107s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 107s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 107s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 107s raise new_e 107s 107s # conn.request() calls http.client.*.request, not the method in 107s # urllib3.request. It also calls makefile (recv) on the socket. 
107s try: 107s conn.request( 107s method, 107s url, 107s body=body, 107s headers=headers, 107s chunked=chunked, 107s preload_content=preload_content, 107s decode_content=decode_content, 107s enforce_content_length=enforce_content_length, 107s ) 107s 107s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 107s # legitimately able to close the connection after sending a valid response. 107s # With this behaviour, the received response is still readable. 107s except BrokenPipeError: 107s pass 107s except OSError as e: 107s # MacOS/Linux 107s # EPROTOTYPE and ECONNRESET are needed on macOS 107s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 107s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 107s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 107s raise 107s 107s # Reset the timeout for the recv() on the socket 107s read_timeout = timeout_obj.read_timeout 107s 107s if not conn.is_closed: 107s # In Python 3 socket.py will catch EAGAIN and return None when you 107s # try and read into the file pointer created by http.client, which 107s # instead raises a BadStatusLine exception. Instead of catching 107s # the exception and assuming all BadStatusLine exceptions are read 107s # timeouts, check for a zero timeout before making the request. 107s if read_timeout == 0: 107s raise ReadTimeoutError( 107s self, url, f"Read timed out. (read timeout={read_timeout})" 107s ) 107s conn.timeout = read_timeout 107s 107s # Receive the response from the server 107s try: 107s response = conn.getresponse() 107s except (BaseSSLError, OSError) as e: 107s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 107s raise 107s 107s # Set properties that are used by the pooling layer. 
107s response.retries = retries 107s response._connection = response_conn # type: ignore[attr-defined] 107s response._pool = self # type: ignore[attr-defined] 107s 107s log.debug( 107s '%s://%s:%s "%s %s %s" %s %s', 107s self.scheme, 107s self.host, 107s self.port, 107s method, 107s url, 107s > response.version_string, 107s response.status, 107s response.length_remaining, 107s ) 107s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 107s 107s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 107s ----------------------------- Captured stderr call ----------------------------- 107s 127.0.0.1 - - [19/Jan/2025 20:23:21] "GET /basic-auth/user/passwd HTTP/1.1" 401 0 107s _______________________________ test_post[https] _______________________________ 107s 107s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-1/test_post_https_0') 107s httpbin_both = 107s verify_pool_mgr = 107s 107s def test_post(tmpdir, httpbin_both, verify_pool_mgr): 107s """Ensure that we can post and cache the results""" 107s data = {"key1": "value1", "key2": "value2"} 107s url = httpbin_both.url + "/post" 107s with vcr.use_cassette(str(tmpdir.join("verify_pool_mgr.yaml"))): 107s > req1 = verify_pool_mgr.request("POST", url, data).data 107s 107s tests/integration/test_urllib3.py:94: 107s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 107s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:143: in request 107s return self.request_encode_body( 107s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:278: in request_encode_body 107s return self.urlopen(method, url, **extra_kw) 107s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 107s response = conn.urlopen(method, u.request_uri, **kw) 107s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 107s response = self._make_request( 107s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 107s 107s self = 107s conn = 107s method = 'POST', url = '/post', body = {'key1': 'value1', 'key2': 'value2'} 107s headers = HTTPHeaderDict({}) 107s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 107s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 107s chunked = False, response_conn = None, preload_content = True 107s decode_content = True, enforce_content_length = True 107s 107s def _make_request( 107s self, 107s conn: BaseHTTPConnection, 107s method: str, 107s url: str, 107s body: _TYPE_BODY | None = None, 107s headers: typing.Mapping[str, str] | None = None, 107s retries: Retry | None = None, 107s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 107s chunked: bool = False, 107s response_conn: BaseHTTPConnection | None = None, 107s preload_content: bool = True, 107s decode_content: bool = True, 107s enforce_content_length: bool = True, 107s ) -> BaseHTTPResponse: 107s """ 107s Perform a request on a given urllib connection object taken from our 107s pool. 107s 107s :param conn: 107s a connection from one of our connection pools 107s 107s :param method: 107s HTTP request method (such as GET, POST, PUT, etc.) 107s 107s :param url: 107s The URL to perform the request on. 107s 107s :param body: 107s Data to send in the request body, either :class:`str`, :class:`bytes`, 107s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 107s 107s :param headers: 107s Dictionary of custom headers to send, such as User-Agent, 107s If-None-Match, etc. 
If None, pool headers are used. If provided, 107s these headers completely replace any pool-specific headers. 107s 107s :param retries: 107s Configure the number of retries to allow before raising a 107s :class:`~urllib3.exceptions.MaxRetryError` exception. 107s 107s Pass ``None`` to retry until you receive a response. Pass a 107s :class:`~urllib3.util.retry.Retry` object for fine-grained control 107s over different types of retries. 107s Pass an integer number to retry connection errors that many times, 107s but no other types of errors. Pass zero to never retry. 107s 107s If ``False``, then retries are disabled and any exception is raised 107s immediately. Also, instead of raising a MaxRetryError on redirects, 107s the redirect response will be returned. 107s 107s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 107s 107s :param timeout: 107s If specified, overrides the default timeout for this one 107s request. It may be a float (in seconds) or an instance of 107s :class:`urllib3.util.Timeout`. 107s 107s :param chunked: 107s If True, urllib3 will send the body using chunked transfer 107s encoding. Otherwise, urllib3 will send the body using the standard 107s content-length form. Defaults to False. 107s 107s :param response_conn: 107s Set this to ``None`` if you will handle releasing the connection or 107s set the connection to have the response release it. 107s 107s :param preload_content: 107s If True, the response's body will be preloaded during construction. 107s 107s :param decode_content: 107s If True, will attempt to decode the body based on the 107s 'content-encoding' header. 107s 107s :param enforce_content_length: 107s Enforce content length checking. Body returned by server must match 107s value of Content-Length header, if present. Otherwise, raise error. 107s """ 107s self.num_requests += 1 107s 107s timeout_obj = self._get_timeout(timeout) 107s timeout_obj.start_connect() 107s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 107s 107s try: 107s # Trigger any extra validation we need to do. 107s try: 107s self._validate_conn(conn) 107s except (SocketTimeout, BaseSSLError) as e: 107s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 107s raise 107s 107s # _validate_conn() starts the connection to an HTTPS proxy 107s # so we need to wrap errors with 'ProxyError' here too. 107s except ( 107s OSError, 107s NewConnectionError, 107s TimeoutError, 107s BaseSSLError, 107s CertificateError, 107s SSLError, 107s ) as e: 107s new_e: Exception = e 107s if isinstance(e, (BaseSSLError, CertificateError)): 107s new_e = SSLError(e) 107s # If the connection didn't successfully connect to it's proxy 107s # then there 107s if isinstance( 107s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 107s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 107s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 107s raise new_e 107s 107s # conn.request() calls http.client.*.request, not the method in 107s # urllib3.request. It also calls makefile (recv) on the socket. 107s try: 107s conn.request( 107s method, 107s url, 107s body=body, 107s headers=headers, 107s chunked=chunked, 107s preload_content=preload_content, 107s decode_content=decode_content, 107s enforce_content_length=enforce_content_length, 107s ) 107s 107s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 107s # legitimately able to close the connection after sending a valid response. 
107s # With this behaviour, the received response is still readable. 107s except BrokenPipeError: 107s pass 107s except OSError as e: 107s # MacOS/Linux 107s # EPROTOTYPE and ECONNRESET are needed on macOS 107s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 107s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 107s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 107s raise 107s 107s # Reset the timeout for the recv() on the socket 107s read_timeout = timeout_obj.read_timeout 107s 107s if not conn.is_closed: 107s # In Python 3 socket.py will catch EAGAIN and return None when you 107s # try and read into the file pointer created by http.client, which 107s # instead raises a BadStatusLine exception. Instead of catching 107s # the exception and assuming all BadStatusLine exceptions are read 107s # timeouts, check for a zero timeout before making the request. 107s if read_timeout == 0: 107s raise ReadTimeoutError( 107s self, url, f"Read timed out. (read timeout={read_timeout})" 107s ) 107s conn.timeout = read_timeout 107s 107s # Receive the response from the server 107s try: 107s response = conn.getresponse() 107s except (BaseSSLError, OSError) as e: 107s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 107s raise 107s 107s # Set properties that are used by the pooling layer. 107s response.retries = retries 107s response._connection = response_conn # type: ignore[attr-defined] 107s response._pool = self # type: ignore[attr-defined] 107s 107s log.debug( 107s '%s://%s:%s "%s %s %s" %s %s', 107s self.scheme, 107s self.host, 107s self.port, 107s method, 107s url, 107s > response.version_string, 107s response.status, 107s response.length_remaining, 107s ) 107s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 107s 107s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 107s ----------------------------- Captured stderr call ----------------------------- 107s 127.0.0.1 - - [19/Jan/2025 20:23:21] "POST /post HTTP/1.1" 501 159 107s _______________________________ test_gzip[https] _______________________________ 107s 107s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-1/test_gzip_https_0') 107s httpbin_both = 107s verify_pool_mgr = 107s 107s def test_gzip(tmpdir, httpbin_both, verify_pool_mgr): 107s """ 107s Ensure that requests (actually urllib3) is able to automatically decompress 107s the response body 107s """ 107s url = httpbin_both.url + "/gzip" 107s response = verify_pool_mgr.request("GET", url) 107s 107s with vcr.use_cassette(str(tmpdir.join("gzip.yaml"))): 107s > response = verify_pool_mgr.request("GET", url) 107s 107s tests/integration/test_urllib3.py:140: 107s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 107s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 107s return self.request_encode_url( 107s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 107s return self.urlopen(method, url, **extra_kw) 107s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 107s response = conn.urlopen(method, u.request_uri, **kw) 107s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 107s response = self._make_request( 107s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 107s 107s self = 107s conn = 107s method = 'GET', url = '/gzip', body = None, headers = {} 107s retries = Retry(total=3, 
connect=None, read=None, redirect=None, status=None) 107s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 107s chunked = False, response_conn = None, preload_content = True 107s decode_content = True, enforce_content_length = True 107s 107s def _make_request( 107s self, 107s conn: BaseHTTPConnection, 107s method: str, 107s url: str, 107s body: _TYPE_BODY | None = None, 107s headers: typing.Mapping[str, str] | None = None, 107s retries: Retry | None = None, 107s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 107s chunked: bool = False, 107s response_conn: BaseHTTPConnection | None = None, 107s preload_content: bool = True, 107s decode_content: bool = True, 107s enforce_content_length: bool = True, 107s ) -> BaseHTTPResponse: 107s """ 107s Perform a request on a given urllib connection object taken from our 107s pool. 107s 107s :param conn: 107s a connection from one of our connection pools 107s 107s :param method: 107s HTTP request method (such as GET, POST, PUT, etc.) 107s 107s :param url: 107s The URL to perform the request on. 107s 107s :param body: 107s Data to send in the request body, either :class:`str`, :class:`bytes`, 107s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 107s 107s :param headers: 107s Dictionary of custom headers to send, such as User-Agent, 107s If-None-Match, etc. If None, pool headers are used. If provided, 107s these headers completely replace any pool-specific headers. 107s 107s :param retries: 107s Configure the number of retries to allow before raising a 107s :class:`~urllib3.exceptions.MaxRetryError` exception. 107s 107s Pass ``None`` to retry until you receive a response. Pass a 107s :class:`~urllib3.util.retry.Retry` object for fine-grained control 107s over different types of retries. 107s Pass an integer number to retry connection errors that many times, 107s but no other types of errors. Pass zero to never retry. 107s 107s If ``False``, then retries are disabled and any exception is raised 107s immediately. Also, instead of raising a MaxRetryError on redirects, 107s the redirect response will be returned. 107s 107s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 107s 107s :param timeout: 107s If specified, overrides the default timeout for this one 107s request. It may be a float (in seconds) or an instance of 107s :class:`urllib3.util.Timeout`. 107s 107s :param chunked: 107s If True, urllib3 will send the body using chunked transfer 107s encoding. Otherwise, urllib3 will send the body using the standard 107s content-length form. Defaults to False. 107s 107s :param response_conn: 107s Set this to ``None`` if you will handle releasing the connection or 107s set the connection to have the response release it. 107s 107s :param preload_content: 107s If True, the response's body will be preloaded during construction. 107s 107s :param decode_content: 107s If True, will attempt to decode the body based on the 107s 'content-encoding' header. 107s 107s :param enforce_content_length: 107s Enforce content length checking. Body returned by server must match 107s value of Content-Length header, if present. Otherwise, raise error. 107s """ 107s self.num_requests += 1 107s 107s timeout_obj = self._get_timeout(timeout) 107s timeout_obj.start_connect() 107s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 107s 107s try: 107s # Trigger any extra validation we need to do. 
107s try: 107s self._validate_conn(conn) 107s except (SocketTimeout, BaseSSLError) as e: 107s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 107s raise 107s 107s # _validate_conn() starts the connection to an HTTPS proxy 107s # so we need to wrap errors with 'ProxyError' here too. 107s except ( 107s OSError, 107s NewConnectionError, 107s TimeoutError, 107s BaseSSLError, 107s CertificateError, 107s SSLError, 107s ) as e: 107s new_e: Exception = e 107s if isinstance(e, (BaseSSLError, CertificateError)): 107s new_e = SSLError(e) 107s # If the connection didn't successfully connect to it's proxy 107s # then there 107s if isinstance( 107s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 107s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 107s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 107s raise new_e 107s 107s # conn.request() calls http.client.*.request, not the method in 107s # urllib3.request. It also calls makefile (recv) on the socket. 107s try: 107s conn.request( 107s method, 107s url, 107s body=body, 107s headers=headers, 107s chunked=chunked, 107s preload_content=preload_content, 107s decode_content=decode_content, 107s enforce_content_length=enforce_content_length, 107s ) 107s 107s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 107s # legitimately able to close the connection after sending a valid response. 107s # With this behaviour, the received response is still readable. 107s except BrokenPipeError: 107s pass 107s except OSError as e: 107s # MacOS/Linux 107s # EPROTOTYPE and ECONNRESET are needed on macOS 107s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 107s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 107s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 107s raise 107s 107s # Reset the timeout for the recv() on the socket 107s read_timeout = timeout_obj.read_timeout 107s 107s if not conn.is_closed: 107s # In Python 3 socket.py will catch EAGAIN and return None when you 107s # try and read into the file pointer created by http.client, which 107s # instead raises a BadStatusLine exception. Instead of catching 107s # the exception and assuming all BadStatusLine exceptions are read 107s # timeouts, check for a zero timeout before making the request. 107s if read_timeout == 0: 107s raise ReadTimeoutError( 107s self, url, f"Read timed out. (read timeout={read_timeout})" 107s ) 107s conn.timeout = read_timeout 107s 107s # Receive the response from the server 107s try: 107s response = conn.getresponse() 107s except (BaseSSLError, OSError) as e: 107s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 107s raise 107s 107s # Set properties that are used by the pooling layer. 
107s response.retries = retries 107s response._connection = response_conn # type: ignore[attr-defined] 107s response._pool = self # type: ignore[attr-defined] 107s 107s log.debug( 107s '%s://%s:%s "%s %s %s" %s %s', 107s self.scheme, 107s self.host, 107s self.port, 107s method, 107s url, 107s > response.version_string, 107s response.status, 107s response.length_remaining, 107s ) 107s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 107s 107s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 107s ----------------------------- Captured stderr call ----------------------------- 107s 127.0.0.1 - - [19/Jan/2025 20:23:21] "GET /gzip HTTP/1.1" 200 165 107s 127.0.0.1 - - [19/Jan/2025 20:23:21] "GET /gzip HTTP/1.1" 200 165 107s ________________________________ test_use_proxy ________________________________ 107s 107s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-1/test_use_proxy0') 107s httpbin = 107s proxy_server = 'http://0.0.0.0:51065' 107s 107s def test_use_proxy(tmpdir, httpbin, proxy_server): 107s """Ensure that it works with a proxy.""" 107s with vcr.use_cassette(str(tmpdir.join("proxy.yaml"))): 107s > response = requests.get(httpbin.url, proxies={"http": proxy_server}) 107s 107s tests/integration/test_proxy.py:53: 107s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 107s /usr/lib/python3/dist-packages/requests/api.py:73: in get 107s return request("get", url, params=params, **kwargs) 107s /usr/lib/python3/dist-packages/requests/api.py:59: in request 107s return session.request(method=method, url=url, **kwargs) 107s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 107s resp = self.send(prep, **send_kwargs) 107s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 107s r = adapter.send(request, **kwargs) 107s /usr/lib/python3/dist-packages/requests/adapters.py:667: in send 107s resp = conn.urlopen( 107s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 107s response = self._make_request( 107s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 107s 107s self = 107s conn = 107s method = 'GET', url = 'http://127.0.0.1:33485/', body = None 107s headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate, br', 'Accept': '*/*', 'Connection': 'keep-alive'} 107s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 107s timeout = Timeout(connect=None, read=None, total=None), chunked = False 107s response_conn = 107s preload_content = False, decode_content = False, enforce_content_length = True 107s 107s def _make_request( 107s self, 107s conn: BaseHTTPConnection, 107s method: str, 107s url: str, 107s body: _TYPE_BODY | None = None, 107s headers: typing.Mapping[str, str] | None = None, 107s retries: Retry | None = None, 107s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 107s chunked: bool = False, 107s response_conn: BaseHTTPConnection | None = None, 107s preload_content: bool = True, 107s decode_content: bool = True, 107s enforce_content_length: bool = True, 107s ) -> BaseHTTPResponse: 107s """ 107s Perform a request on a given urllib connection object taken from our 107s pool. 107s 107s :param conn: 107s a connection from one of our connection pools 107s 107s :param method: 107s HTTP request method (such as GET, POST, PUT, etc.) 107s 107s :param url: 107s The URL to perform the request on. 
107s 107s :param body: 107s Data to send in the request body, either :class:`str`, :class:`bytes`, 107s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 107s 107s :param headers: 107s Dictionary of custom headers to send, such as User-Agent, 107s If-None-Match, etc. If None, pool headers are used. If provided, 107s these headers completely replace any pool-specific headers. 107s 107s :param retries: 107s Configure the number of retries to allow before raising a 107s :class:`~urllib3.exceptions.MaxRetryError` exception. 107s 107s Pass ``None`` to retry until you receive a response. Pass a 107s :class:`~urllib3.util.retry.Retry` object for fine-grained control 107s over different types of retries. 107s Pass an integer number to retry connection errors that many times, 107s but no other types of errors. Pass zero to never retry. 107s 107s If ``False``, then retries are disabled and any exception is raised 107s immediately. Also, instead of raising a MaxRetryError on redirects, 107s the redirect response will be returned. 107s 107s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 107s 107s :param timeout: 107s If specified, overrides the default timeout for this one 107s request. It may be a float (in seconds) or an instance of 107s :class:`urllib3.util.Timeout`. 107s 107s :param chunked: 107s If True, urllib3 will send the body using chunked transfer 107s encoding. Otherwise, urllib3 will send the body using the standard 107s content-length form. Defaults to False. 107s 107s :param response_conn: 107s Set this to ``None`` if you will handle releasing the connection or 107s set the connection to have the response release it. 107s 107s :param preload_content: 107s If True, the response's body will be preloaded during construction. 107s 107s :param decode_content: 107s If True, will attempt to decode the body based on the 107s 'content-encoding' header. 107s 107s :param enforce_content_length: 107s Enforce content length checking. Body returned by server must match 107s value of Content-Length header, if present. Otherwise, raise error. 107s """ 107s self.num_requests += 1 107s 107s timeout_obj = self._get_timeout(timeout) 107s timeout_obj.start_connect() 107s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 107s 107s try: 107s # Trigger any extra validation we need to do. 107s try: 107s self._validate_conn(conn) 107s except (SocketTimeout, BaseSSLError) as e: 107s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 107s raise 107s 107s # _validate_conn() starts the connection to an HTTPS proxy 107s # so we need to wrap errors with 'ProxyError' here too. 107s except ( 107s OSError, 107s NewConnectionError, 107s TimeoutError, 107s BaseSSLError, 107s CertificateError, 107s SSLError, 107s ) as e: 107s new_e: Exception = e 107s if isinstance(e, (BaseSSLError, CertificateError)): 107s new_e = SSLError(e) 107s # If the connection didn't successfully connect to it's proxy 107s # then there 107s if isinstance( 107s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 107s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 107s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 107s raise new_e 107s 107s # conn.request() calls http.client.*.request, not the method in 107s # urllib3.request. It also calls makefile (recv) on the socket. 
107s try: 107s conn.request( 107s method, 107s url, 107s body=body, 107s headers=headers, 107s chunked=chunked, 107s preload_content=preload_content, 107s decode_content=decode_content, 107s enforce_content_length=enforce_content_length, 107s ) 107s 107s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 107s # legitimately able to close the connection after sending a valid response. 107s # With this behaviour, the received response is still readable. 107s except BrokenPipeError: 107s pass 107s except OSError as e: 107s # MacOS/Linux 107s # EPROTOTYPE and ECONNRESET are needed on macOS 107s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 107s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 107s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 107s raise 107s 107s # Reset the timeout for the recv() on the socket 107s read_timeout = timeout_obj.read_timeout 107s 107s if not conn.is_closed: 107s # In Python 3 socket.py will catch EAGAIN and return None when you 107s # try and read into the file pointer created by http.client, which 107s # instead raises a BadStatusLine exception. Instead of catching 107s # the exception and assuming all BadStatusLine exceptions are read 107s # timeouts, check for a zero timeout before making the request. 107s if read_timeout == 0: 107s raise ReadTimeoutError( 107s self, url, f"Read timed out. (read timeout={read_timeout})" 107s ) 107s conn.timeout = read_timeout 107s 107s # Receive the response from the server 107s try: 107s response = conn.getresponse() 107s except (BaseSSLError, OSError) as e: 107s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 107s raise 107s 107s # Set properties that are used by the pooling layer. 
107s response.retries = retries 107s response._connection = response_conn # type: ignore[attr-defined] 107s response._pool = self # type: ignore[attr-defined] 107s 107s log.debug( 107s '%s://%s:%s "%s %s %s" %s %s', 107s self.scheme, 107s self.host, 107s self.port, 107s method, 107s url, 107s > response.version_string, 107s response.status, 107s response.length_remaining, 107s ) 107s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 107s 107s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 107s ----------------------------- Captured stderr call ----------------------------- 107s 127.0.0.1 - - [19/Jan/2025 20:23:22] "GET / HTTP/1.1" 200 9358 107s 127.0.0.1 - - [19/Jan/2025 20:23:22] "GET http://127.0.0.1:33485/ HTTP/1.1" 200 - 107s ______________________________ test_cross_scheme _______________________________ 107s 107s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-1/test_cross_scheme2') 107s httpbin = 107s httpbin_secure = 107s verify_pool_mgr = 107s 107s def test_cross_scheme(tmpdir, httpbin, httpbin_secure, verify_pool_mgr): 107s """Ensure that requests between schemes are treated separately""" 107s # First fetch a url under http, and then again under https and then 107s # ensure that we haven't served anything out of cache, and we have two 107s # requests / response pairs in the cassette 107s with vcr.use_cassette(str(tmpdir.join("cross_scheme.yaml"))) as cass: 107s > verify_pool_mgr.request("GET", httpbin_secure.url) 107s 107s tests/integration/test_urllib3.py:125: 107s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 107s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 107s return self.request_encode_url( 107s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 107s return self.urlopen(method, url, **extra_kw) 107s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 107s response = conn.urlopen(method, u.request_uri, **kw) 107s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 107s response = self._make_request( 107s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 107s 107s self = 107s conn = 107s method = 'GET', url = '/', body = None, headers = {} 107s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 107s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 107s chunked = False, response_conn = None, preload_content = True 107s decode_content = True, enforce_content_length = True 107s 107s def _make_request( 107s self, 107s conn: BaseHTTPConnection, 107s method: str, 107s url: str, 107s body: _TYPE_BODY | None = None, 107s headers: typing.Mapping[str, str] | None = None, 107s retries: Retry | None = None, 107s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 107s chunked: bool = False, 107s response_conn: BaseHTTPConnection | None = None, 107s preload_content: bool = True, 107s decode_content: bool = True, 107s enforce_content_length: bool = True, 107s ) -> BaseHTTPResponse: 107s """ 107s Perform a request on a given urllib connection object taken from our 107s pool. 107s 107s :param conn: 107s a connection from one of our connection pools 107s 107s :param method: 107s HTTP request method (such as GET, POST, PUT, etc.) 107s 107s :param url: 107s The URL to perform the request on. 
107s 107s :param body: 107s Data to send in the request body, either :class:`str`, :class:`bytes`, 107s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 107s 107s :param headers: 107s Dictionary of custom headers to send, such as User-Agent, 107s If-None-Match, etc. If None, pool headers are used. If provided, 107s these headers completely replace any pool-specific headers. 107s 107s :param retries: 107s Configure the number of retries to allow before raising a 107s :class:`~urllib3.exceptions.MaxRetryError` exception. 107s 107s Pass ``None`` to retry until you receive a response. Pass a 107s :class:`~urllib3.util.retry.Retry` object for fine-grained control 107s over different types of retries. 107s Pass an integer number to retry connection errors that many times, 107s but no other types of errors. Pass zero to never retry. 107s 107s If ``False``, then retries are disabled and any exception is raised 107s immediately. Also, instead of raising a MaxRetryError on redirects, 107s the redirect response will be returned. 107s 107s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 107s 107s :param timeout: 107s If specified, overrides the default timeout for this one 107s request. It may be a float (in seconds) or an instance of 107s :class:`urllib3.util.Timeout`. 107s 107s :param chunked: 107s If True, urllib3 will send the body using chunked transfer 107s encoding. Otherwise, urllib3 will send the body using the standard 107s content-length form. Defaults to False. 107s 107s :param response_conn: 107s Set this to ``None`` if you will handle releasing the connection or 107s set the connection to have the response release it. 107s 107s :param preload_content: 107s If True, the response's body will be preloaded during construction. 107s 107s :param decode_content: 107s If True, will attempt to decode the body based on the 107s 'content-encoding' header. 107s 107s :param enforce_content_length: 107s Enforce content length checking. Body returned by server must match 107s value of Content-Length header, if present. Otherwise, raise error. 107s """ 107s self.num_requests += 1 107s 107s timeout_obj = self._get_timeout(timeout) 107s timeout_obj.start_connect() 107s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 107s 107s try: 107s # Trigger any extra validation we need to do. 107s try: 107s self._validate_conn(conn) 107s except (SocketTimeout, BaseSSLError) as e: 107s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 107s raise 107s 107s # _validate_conn() starts the connection to an HTTPS proxy 107s # so we need to wrap errors with 'ProxyError' here too. 107s except ( 107s OSError, 107s NewConnectionError, 107s TimeoutError, 107s BaseSSLError, 107s CertificateError, 107s SSLError, 107s ) as e: 107s new_e: Exception = e 107s if isinstance(e, (BaseSSLError, CertificateError)): 107s new_e = SSLError(e) 107s # If the connection didn't successfully connect to it's proxy 107s # then there 107s if isinstance( 107s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 107s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 107s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 107s raise new_e 107s 107s # conn.request() calls http.client.*.request, not the method in 107s # urllib3.request. It also calls makefile (recv) on the socket. 
107s try: 107s conn.request( 107s method, 107s url, 107s body=body, 107s headers=headers, 107s chunked=chunked, 107s preload_content=preload_content, 107s decode_content=decode_content, 107s enforce_content_length=enforce_content_length, 107s ) 107s 107s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 107s # legitimately able to close the connection after sending a valid response. 107s # With this behaviour, the received response is still readable. 107s except BrokenPipeError: 107s pass 107s except OSError as e: 107s # MacOS/Linux 107s # EPROTOTYPE and ECONNRESET are needed on macOS 107s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 107s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 107s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 107s raise 107s 107s # Reset the timeout for the recv() on the socket 107s read_timeout = timeout_obj.read_timeout 107s 107s if not conn.is_closed: 107s # In Python 3 socket.py will catch EAGAIN and return None when you 107s # try and read into the file pointer created by http.client, which 107s # instead raises a BadStatusLine exception. Instead of catching 107s # the exception and assuming all BadStatusLine exceptions are read 107s # timeouts, check for a zero timeout before making the request. 107s if read_timeout == 0: 107s raise ReadTimeoutError( 107s self, url, f"Read timed out. (read timeout={read_timeout})" 107s ) 107s conn.timeout = read_timeout 107s 107s # Receive the response from the server 107s try: 107s response = conn.getresponse() 107s except (BaseSSLError, OSError) as e: 107s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 107s raise 107s 107s # Set properties that are used by the pooling layer. 
107s response.retries = retries 107s response._connection = response_conn # type: ignore[attr-defined] 107s response._pool = self # type: ignore[attr-defined] 107s 107s log.debug( 107s '%s://%s:%s "%s %s %s" %s %s', 107s self.scheme, 107s self.host, 107s self.port, 107s method, 107s url, 107s > response.version_string, 107s response.status, 107s response.length_remaining, 107s ) 107s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 107s 107s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 107s ----------------------------- Captured stderr call ----------------------------- 107s 127.0.0.1 - - [19/Jan/2025 20:23:22] "GET / HTTP/1.1" 200 9358 107s ___________________ test_https_with_cert_validation_disabled ___________________ 107s 107s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-1/test_https_with_cert_validatio0') 107s httpbin_secure = 107s pool_mgr = 107s 107s def test_https_with_cert_validation_disabled(tmpdir, httpbin_secure, pool_mgr): 107s with vcr.use_cassette(str(tmpdir.join("cert_validation_disabled.yaml"))): 107s > pool_mgr.request("GET", httpbin_secure.url) 107s 107s tests/integration/test_urllib3.py:149: 107s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 107s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 107s return self.request_encode_url( 107s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 107s return self.urlopen(method, url, **extra_kw) 107s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 107s response = conn.urlopen(method, u.request_uri, **kw) 107s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 107s response = self._make_request( 107s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 107s 107s self = 107s conn = 107s method = 'GET', url = '/', body = None, headers = {} 107s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 107s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 107s chunked = False, response_conn = None, preload_content = True 107s decode_content = True, enforce_content_length = True 107s 107s def _make_request( 107s self, 107s conn: BaseHTTPConnection, 107s method: str, 107s url: str, 107s body: _TYPE_BODY | None = None, 107s headers: typing.Mapping[str, str] | None = None, 107s retries: Retry | None = None, 107s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 107s chunked: bool = False, 107s response_conn: BaseHTTPConnection | None = None, 107s preload_content: bool = True, 107s decode_content: bool = True, 107s enforce_content_length: bool = True, 107s ) -> BaseHTTPResponse: 107s """ 107s Perform a request on a given urllib connection object taken from our 107s pool. 107s 107s :param conn: 107s a connection from one of our connection pools 107s 107s :param method: 107s HTTP request method (such as GET, POST, PUT, etc.) 107s 107s :param url: 107s The URL to perform the request on. 107s 107s :param body: 107s Data to send in the request body, either :class:`str`, :class:`bytes`, 107s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 107s 107s :param headers: 107s Dictionary of custom headers to send, such as User-Agent, 107s If-None-Match, etc. If None, pool headers are used. If provided, 107s these headers completely replace any pool-specific headers. 
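As the headers description above notes, per-request headers replace the pool-level headers rather than being merged with them. A small illustration, with placeholder URL and header values:

    import urllib3

    # Pool-level headers apply when a request passes headers=None.
    http = urllib3.PoolManager(headers={"User-Agent": "demo-pool-agent"})
    r1 = http.request("GET", "https://example.com/")

    # Per-request headers replace the pool headers entirely, so the pool-level
    # User-Agent above is not sent with this request.
    r2 = http.request("GET", "https://example.com/", headers={"Accept": "application/json"})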
107s 107s :param retries: 107s Configure the number of retries to allow before raising a 107s :class:`~urllib3.exceptions.MaxRetryError` exception. 107s 107s Pass ``None`` to retry until you receive a response. Pass a 107s :class:`~urllib3.util.retry.Retry` object for fine-grained control 107s over different types of retries. 107s Pass an integer number to retry connection errors that many times, 107s but no other types of errors. Pass zero to never retry. 107s 107s If ``False``, then retries are disabled and any exception is raised 107s immediately. Also, instead of raising a MaxRetryError on redirects, 107s the redirect response will be returned. 107s 107s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 107s 107s :param timeout: 107s If specified, overrides the default timeout for this one 107s request. It may be a float (in seconds) or an instance of 107s :class:`urllib3.util.Timeout`. 107s 107s :param chunked: 107s If True, urllib3 will send the body using chunked transfer 107s encoding. Otherwise, urllib3 will send the body using the standard 107s content-length form. Defaults to False. 107s 107s :param response_conn: 107s Set this to ``None`` if you will handle releasing the connection or 107s set the connection to have the response release it. 107s 107s :param preload_content: 107s If True, the response's body will be preloaded during construction. 107s 107s :param decode_content: 107s If True, will attempt to decode the body based on the 107s 'content-encoding' header. 107s 107s :param enforce_content_length: 107s Enforce content length checking. Body returned by server must match 107s value of Content-Length header, if present. Otherwise, raise error. 107s """ 107s self.num_requests += 1 107s 107s timeout_obj = self._get_timeout(timeout) 107s timeout_obj.start_connect() 107s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 107s 107s try: 107s # Trigger any extra validation we need to do. 107s try: 107s self._validate_conn(conn) 107s except (SocketTimeout, BaseSSLError) as e: 107s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 107s raise 107s 107s # _validate_conn() starts the connection to an HTTPS proxy 107s # so we need to wrap errors with 'ProxyError' here too. 107s except ( 107s OSError, 107s NewConnectionError, 107s TimeoutError, 107s BaseSSLError, 107s CertificateError, 107s SSLError, 107s ) as e: 107s new_e: Exception = e 107s if isinstance(e, (BaseSSLError, CertificateError)): 107s new_e = SSLError(e) 107s # If the connection didn't successfully connect to it's proxy 107s # then there 107s if isinstance( 107s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 107s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 107s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 107s raise new_e 107s 107s # conn.request() calls http.client.*.request, not the method in 107s # urllib3.request. It also calls makefile (recv) on the socket. 107s try: 107s conn.request( 107s method, 107s url, 107s body=body, 107s headers=headers, 107s chunked=chunked, 107s preload_content=preload_content, 107s decode_content=decode_content, 107s enforce_content_length=enforce_content_length, 107s ) 107s 107s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 107s # legitimately able to close the connection after sending a valid response. 107s # With this behaviour, the received response is still readable. 
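The preload_content and decode_content flags described above are most often exercised through streaming. A brief sketch, assuming urllib3 v2 and a placeholder URL:

    import urllib3

    http = urllib3.PoolManager()
    resp = http.request("GET", "https://example.com/", preload_content=False)
    for chunk in resp.stream(1024):   # read the body in 1 KiB chunks
        pass                          # process each chunk here
    resp.release_conn()               # return the connection to the pool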
107s except BrokenPipeError: 107s pass 107s except OSError as e: 107s # MacOS/Linux 107s # EPROTOTYPE and ECONNRESET are needed on macOS 107s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 107s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 107s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 107s raise 107s 107s # Reset the timeout for the recv() on the socket 107s read_timeout = timeout_obj.read_timeout 107s 107s if not conn.is_closed: 107s # In Python 3 socket.py will catch EAGAIN and return None when you 107s # try and read into the file pointer created by http.client, which 107s # instead raises a BadStatusLine exception. Instead of catching 107s # the exception and assuming all BadStatusLine exceptions are read 107s # timeouts, check for a zero timeout before making the request. 107s if read_timeout == 0: 107s raise ReadTimeoutError( 107s self, url, f"Read timed out. (read timeout={read_timeout})" 107s ) 107s conn.timeout = read_timeout 107s 107s # Receive the response from the server 107s try: 107s response = conn.getresponse() 107s except (BaseSSLError, OSError) as e: 107s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 107s raise 107s 107s # Set properties that are used by the pooling layer. 107s response.retries = retries 107s response._connection = response_conn # type: ignore[attr-defined] 107s response._pool = self # type: ignore[attr-defined] 107s 107s log.debug( 107s '%s://%s:%s "%s %s %s" %s %s', 107s self.scheme, 107s self.host, 107s self.port, 107s method, 107s url, 107s > response.version_string, 107s response.status, 107s response.length_remaining, 107s ) 107s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 107s 107s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 107s ----------------------------- Captured stderr call ----------------------------- 107s 127.0.0.1 - - [19/Jan/2025 20:23:22] "GET / HTTP/1.1" 200 9358 107s _____________________________ test_domain_redirect _____________________________ 107s 107s def test_domain_redirect(): 107s """Ensure that redirects across domains are considered unique""" 107s # In this example, seomoz.org redirects to moz.com, and if those 107s # requests are considered identical, then we'll be stuck in a redirect 107s # loop. 
107s url = "http://seomoz.org/" 107s with vcr.use_cassette("tests/fixtures/wild/domain_redirect.yaml") as cass: 107s > requests.get(url, headers={"User-Agent": "vcrpy-test"}) 107s 107s tests/integration/test_wild.py:20: 107s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 107s /usr/lib/python3/dist-packages/requests/api.py:73: in get 107s return request("get", url, params=params, **kwargs) 107s /usr/lib/python3/dist-packages/requests/api.py:59: in request 107s return session.request(method=method, url=url, **kwargs) 107s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 107s resp = self.send(prep, **send_kwargs) 107s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 107s r = adapter.send(request, **kwargs) 107s /usr/lib/python3/dist-packages/requests/adapters.py:667: in send 107s resp = conn.urlopen( 107s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 107s response = self._make_request( 107s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 107s 107s self = 107s conn = 107s method = 'GET', url = '/', body = None 107s headers = {'User-Agent': 'vcrpy-test', 'Accept-Encoding': 'gzip, deflate, br', 'Accept': '*/*', 'Connection': 'keep-alive'} 107s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 107s timeout = Timeout(connect=None, read=None, total=None), chunked = False 107s response_conn = 107s preload_content = False, decode_content = False, enforce_content_length = True 107s 107s def _make_request( 107s self, 107s conn: BaseHTTPConnection, 107s method: str, 107s url: str, 107s body: _TYPE_BODY | None = None, 107s headers: typing.Mapping[str, str] | None = None, 107s retries: Retry | None = None, 107s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 107s chunked: bool = False, 107s response_conn: BaseHTTPConnection | None = None, 107s preload_content: bool = True, 107s decode_content: bool = True, 107s enforce_content_length: bool = True, 107s ) -> BaseHTTPResponse: 107s """ 107s Perform a request on a given urllib connection object taken from our 107s pool. 107s 107s :param conn: 107s a connection from one of our connection pools 107s 107s :param method: 107s HTTP request method (such as GET, POST, PUT, etc.) 107s 107s :param url: 107s The URL to perform the request on. 107s 107s :param body: 107s Data to send in the request body, either :class:`str`, :class:`bytes`, 107s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 107s 107s :param headers: 107s Dictionary of custom headers to send, such as User-Agent, 107s If-None-Match, etc. If None, pool headers are used. If provided, 107s these headers completely replace any pool-specific headers. 107s 107s :param retries: 107s Configure the number of retries to allow before raising a 107s :class:`~urllib3.exceptions.MaxRetryError` exception. 107s 107s Pass ``None`` to retry until you receive a response. Pass a 107s :class:`~urllib3.util.retry.Retry` object for fine-grained control 107s over different types of retries. 107s Pass an integer number to retry connection errors that many times, 107s but no other types of errors. Pass zero to never retry. 107s 107s If ``False``, then retries are disabled and any exception is raised 107s immediately. Also, instead of raising a MaxRetryError on redirects, 107s the redirect response will be returned. 107s 107s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 
107s 107s :param timeout: 107s If specified, overrides the default timeout for this one 107s request. It may be a float (in seconds) or an instance of 107s :class:`urllib3.util.Timeout`. 107s 107s :param chunked: 107s If True, urllib3 will send the body using chunked transfer 107s encoding. Otherwise, urllib3 will send the body using the standard 107s content-length form. Defaults to False. 107s 107s :param response_conn: 107s Set this to ``None`` if you will handle releasing the connection or 107s set the connection to have the response release it. 107s 107s :param preload_content: 107s If True, the response's body will be preloaded during construction. 107s 107s :param decode_content: 107s If True, will attempt to decode the body based on the 107s 'content-encoding' header. 107s 107s :param enforce_content_length: 107s Enforce content length checking. Body returned by server must match 107s value of Content-Length header, if present. Otherwise, raise error. 107s """ 107s self.num_requests += 1 107s 107s timeout_obj = self._get_timeout(timeout) 107s timeout_obj.start_connect() 107s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 107s 107s try: 107s # Trigger any extra validation we need to do. 107s try: 107s self._validate_conn(conn) 107s except (SocketTimeout, BaseSSLError) as e: 107s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 107s raise 107s 107s # _validate_conn() starts the connection to an HTTPS proxy 107s # so we need to wrap errors with 'ProxyError' here too. 107s except ( 107s OSError, 107s NewConnectionError, 107s TimeoutError, 107s BaseSSLError, 107s CertificateError, 107s SSLError, 107s ) as e: 107s new_e: Exception = e 107s if isinstance(e, (BaseSSLError, CertificateError)): 107s new_e = SSLError(e) 107s # If the connection didn't successfully connect to it's proxy 107s # then there 107s if isinstance( 107s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 107s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 107s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 107s raise new_e 107s 107s # conn.request() calls http.client.*.request, not the method in 107s # urllib3.request. It also calls makefile (recv) on the socket. 107s try: 107s conn.request( 107s method, 107s url, 107s body=body, 107s headers=headers, 107s chunked=chunked, 107s preload_content=preload_content, 107s decode_content=decode_content, 107s enforce_content_length=enforce_content_length, 107s ) 107s 107s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 107s # legitimately able to close the connection after sending a valid response. 107s # With this behaviour, the received response is still readable. 107s except BrokenPipeError: 107s pass 107s except OSError as e: 107s # MacOS/Linux 107s # EPROTOTYPE and ECONNRESET are needed on macOS 107s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 107s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 107s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 107s raise 107s 107s # Reset the timeout for the recv() on the socket 107s read_timeout = timeout_obj.read_timeout 107s 107s if not conn.is_closed: 107s # In Python 3 socket.py will catch EAGAIN and return None when you 107s # try and read into the file pointer created by http.client, which 107s # instead raises a BadStatusLine exception. 
Instead of catching 107s # the exception and assuming all BadStatusLine exceptions are read 107s # timeouts, check for a zero timeout before making the request. 107s if read_timeout == 0: 107s raise ReadTimeoutError( 107s self, url, f"Read timed out. (read timeout={read_timeout})" 107s ) 107s conn.timeout = read_timeout 107s 107s # Receive the response from the server 107s try: 107s response = conn.getresponse() 107s except (BaseSSLError, OSError) as e: 107s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 107s raise 107s 107s # Set properties that are used by the pooling layer. 107s response.retries = retries 107s response._connection = response_conn # type: ignore[attr-defined] 107s response._pool = self # type: ignore[attr-defined] 107s 107s log.debug( 107s '%s://%s:%s "%s %s %s" %s %s', 107s self.scheme, 107s self.host, 107s self.port, 107s method, 107s url, 107s > response.version_string, 107s response.status, 107s response.length_remaining, 107s ) 107s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 107s 107s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 107s _________________________________ test_cookies _________________________________ 107s 107s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-1/test_cookies0') 107s httpbin = 107s 107s def test_cookies(tmpdir, httpbin): 107s testfile = str(tmpdir.join("cookies.yml")) 107s with vcr.use_cassette(testfile): 107s with requests.Session() as s: 107s > s.get(httpbin.url + "/cookies/set?k1=v1&k2=v2") 107s 107s tests/integration/test_wild.py:67: 107s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 107s /usr/lib/python3/dist-packages/requests/sessions.py:602: in get 107s return self.request("GET", url, **kwargs) 107s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 107s resp = self.send(prep, **send_kwargs) 107s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 107s r = adapter.send(request, **kwargs) 107s /usr/lib/python3/dist-packages/requests/adapters.py:667: in send 107s resp = conn.urlopen( 107s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 107s response = self._make_request( 107s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 107s 107s self = 107s conn = 107s method = 'GET', url = '/cookies/set?k1=v1&k2=v2', body = None 107s headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate, br', 'Accept': '*/*', 'Connection': 'keep-alive'} 107s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 107s timeout = Timeout(connect=None, read=None, total=None), chunked = False 107s response_conn = 107s preload_content = False, decode_content = False, enforce_content_length = True 107s 107s def _make_request( 107s self, 107s conn: BaseHTTPConnection, 107s method: str, 107s url: str, 107s body: _TYPE_BODY | None = None, 107s headers: typing.Mapping[str, str] | None = None, 107s retries: Retry | None = None, 107s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 107s chunked: bool = False, 107s response_conn: BaseHTTPConnection | None = None, 107s preload_content: bool = True, 107s decode_content: bool = True, 107s enforce_content_length: bool = True, 107s ) -> BaseHTTPResponse: 107s """ 107s Perform a request on a given urllib connection object taken from our 107s pool. 
107s 107s :param conn: 107s a connection from one of our connection pools 107s 107s :param method: 107s HTTP request method (such as GET, POST, PUT, etc.) 107s 107s :param url: 107s The URL to perform the request on. 107s 107s :param body: 107s Data to send in the request body, either :class:`str`, :class:`bytes`, 107s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 107s 107s :param headers: 107s Dictionary of custom headers to send, such as User-Agent, 107s If-None-Match, etc. If None, pool headers are used. If provided, 107s these headers completely replace any pool-specific headers. 107s 107s :param retries: 107s Configure the number of retries to allow before raising a 107s :class:`~urllib3.exceptions.MaxRetryError` exception. 107s 107s Pass ``None`` to retry until you receive a response. Pass a 107s :class:`~urllib3.util.retry.Retry` object for fine-grained control 107s over different types of retries. 107s Pass an integer number to retry connection errors that many times, 107s but no other types of errors. Pass zero to never retry. 107s 107s If ``False``, then retries are disabled and any exception is raised 107s immediately. Also, instead of raising a MaxRetryError on redirects, 107s the redirect response will be returned. 107s 107s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 107s 107s :param timeout: 107s If specified, overrides the default timeout for this one 107s request. It may be a float (in seconds) or an instance of 107s :class:`urllib3.util.Timeout`. 107s 107s :param chunked: 107s If True, urllib3 will send the body using chunked transfer 107s encoding. Otherwise, urllib3 will send the body using the standard 107s content-length form. Defaults to False. 107s 107s :param response_conn: 107s Set this to ``None`` if you will handle releasing the connection or 107s set the connection to have the response release it. 107s 107s :param preload_content: 107s If True, the response's body will be preloaded during construction. 107s 107s :param decode_content: 107s If True, will attempt to decode the body based on the 107s 'content-encoding' header. 107s 107s :param enforce_content_length: 107s Enforce content length checking. Body returned by server must match 107s value of Content-Length header, if present. Otherwise, raise error. 107s """ 107s self.num_requests += 1 107s 107s timeout_obj = self._get_timeout(timeout) 107s timeout_obj.start_connect() 107s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 107s 107s try: 107s # Trigger any extra validation we need to do. 107s try: 107s self._validate_conn(conn) 107s except (SocketTimeout, BaseSSLError) as e: 107s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 107s raise 107s 107s # _validate_conn() starts the connection to an HTTPS proxy 107s # so we need to wrap errors with 'ProxyError' here too. 107s except ( 107s OSError, 107s NewConnectionError, 107s TimeoutError, 107s BaseSSLError, 107s CertificateError, 107s SSLError, 107s ) as e: 107s new_e: Exception = e 107s if isinstance(e, (BaseSSLError, CertificateError)): 107s new_e = SSLError(e) 107s # If the connection didn't successfully connect to it's proxy 107s # then there 107s if isinstance( 107s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 107s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 107s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 107s raise new_e 107s 107s # conn.request() calls http.client.*.request, not the method in 107s # urllib3.request. 
It also calls makefile (recv) on the socket. 107s try: 107s conn.request( 107s method, 107s url, 107s body=body, 107s headers=headers, 107s chunked=chunked, 107s preload_content=preload_content, 107s decode_content=decode_content, 107s enforce_content_length=enforce_content_length, 107s ) 107s 107s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 107s # legitimately able to close the connection after sending a valid response. 107s # With this behaviour, the received response is still readable. 107s except BrokenPipeError: 107s pass 107s except OSError as e: 107s # MacOS/Linux 107s # EPROTOTYPE and ECONNRESET are needed on macOS 107s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 107s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 107s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 107s raise 107s 107s # Reset the timeout for the recv() on the socket 107s read_timeout = timeout_obj.read_timeout 107s 107s if not conn.is_closed: 107s # In Python 3 socket.py will catch EAGAIN and return None when you 107s # try and read into the file pointer created by http.client, which 107s # instead raises a BadStatusLine exception. Instead of catching 107s # the exception and assuming all BadStatusLine exceptions are read 107s # timeouts, check for a zero timeout before making the request. 107s if read_timeout == 0: 107s raise ReadTimeoutError( 107s self, url, f"Read timed out. (read timeout={read_timeout})" 107s ) 107s conn.timeout = read_timeout 107s 107s # Receive the response from the server 107s try: 107s response = conn.getresponse() 107s except (BaseSSLError, OSError) as e: 107s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 107s raise 107s 107s # Set properties that are used by the pooling layer. 107s response.retries = retries 107s response._connection = response_conn # type: ignore[attr-defined] 107s response._pool = self # type: ignore[attr-defined] 107s 107s log.debug( 107s '%s://%s:%s "%s %s %s" %s %s', 107s self.scheme, 107s self.host, 107s self.port, 107s method, 107s url,E: pybuild pybuild:389: test: plugin distutils failed with: exit code=1: cd /tmp/autopkgtest.eenLbd/autopkgtest_tmp/build; python3.12 -m pytest --ignore tests/integration/test_aiohttp.py --ignore tests/integration/test_tornado.py --ignore tests/integration/test_requests.py -m "not online" -k "not test_basic_json_use and not test_load_cassette_with_custom_persister" 107s pybuild-autopkgtest: error: pybuild --autopkgtest --test-pytest -i python{version} -p "3.13 3.12" returned exit code 13 107s make: *** [/tmp/YRn_e4U2Kh/run:4: pybuild-autopkgtest] Error 25 107s pybuild-autopkgtest: error: /tmp/YRn_e4U2Kh/run pybuild-autopkgtest returned exit code 2 107s 107s > response.version_string, 107s response.status, 107s response.length_remaining, 107s ) 107s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 107s 107s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 107s ----------------------------- Captured stderr call ----------------------------- 107s 127.0.0.1 - - [19/Jan/2025 20:23:22] "GET /cookies/set?k1=v1&k2=v2 HTTP/1.1" 302 203 107s =============================== warnings summary =============================== 107s tests/integration/test_config.py:10 107s /tmp/autopkgtest.eenLbd/autopkgtest_tmp/build/tests/integration/test_config.py:10: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? 
You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 107s @pytest.mark.online 107s 107s tests/integration/test_config.py:24 107s /tmp/autopkgtest.eenLbd/autopkgtest_tmp/build/tests/integration/test_config.py:24: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 107s @pytest.mark.online 107s 107s tests/integration/test_config.py:34 107s /tmp/autopkgtest.eenLbd/autopkgtest_tmp/build/tests/integration/test_config.py:34: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 107s @pytest.mark.online 107s 107s tests/integration/test_config.py:47 107s /tmp/autopkgtest.eenLbd/autopkgtest_tmp/build/tests/integration/test_config.py:47: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 107s @pytest.mark.online 107s 107s tests/integration/test_config.py:69 107s /tmp/autopkgtest.eenLbd/autopkgtest_tmp/build/tests/integration/test_config.py:69: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 107s @pytest.mark.online 107s 107s tests/integration/test_disksaver.py:14 107s /tmp/autopkgtest.eenLbd/autopkgtest_tmp/build/tests/integration/test_disksaver.py:14: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 107s @pytest.mark.online 107s 107s tests/integration/test_disksaver.py:35 107s /tmp/autopkgtest.eenLbd/autopkgtest_tmp/build/tests/integration/test_disksaver.py:35: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 107s @pytest.mark.online 107s 107s tests/integration/test_httplib2.py:60 107s /tmp/autopkgtest.eenLbd/autopkgtest_tmp/build/tests/integration/test_httplib2.py:60: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 107s @pytest.mark.online 107s 107s tests/integration/test_register_matcher.py:16 107s /tmp/autopkgtest.eenLbd/autopkgtest_tmp/build/tests/integration/test_register_matcher.py:16: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 107s @pytest.mark.online 107s 107s tests/integration/test_register_matcher.py:32 107s /tmp/autopkgtest.eenLbd/autopkgtest_tmp/build/tests/integration/test_register_matcher.py:32: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? 
You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 107s @pytest.mark.online 107s 107s tests/integration/test_urllib2.py:60 107s /tmp/autopkgtest.eenLbd/autopkgtest_tmp/build/tests/integration/test_urllib2.py:60: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 107s @mark.online 107s 107s tests/integration/test_urllib3.py:102 107s /tmp/autopkgtest.eenLbd/autopkgtest_tmp/build/tests/integration/test_urllib3.py:102: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 107s @pytest.mark.online 107s 107s tests/integration/test_wild.py:55 107s /tmp/autopkgtest.eenLbd/autopkgtest_tmp/build/tests/integration/test_wild.py:55: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 107s @pytest.mark.online 107s 107s tests/integration/test_wild.py:74 107s /tmp/autopkgtest.eenLbd/autopkgtest_tmp/build/tests/integration/test_wild.py:74: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 107s @pytest.mark.online 107s 107s tests/unit/test_stubs.py:20 107s /tmp/autopkgtest.eenLbd/autopkgtest_tmp/build/tests/unit/test_stubs.py:20: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 107s @mark.online 107s 107s tests/unit/test_unittest.py:131 107s /tmp/autopkgtest.eenLbd/autopkgtest_tmp/build/tests/unit/test_unittest.py:131: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 107s @pytest.mark.online 107s 107s tests/unit/test_unittest.py:166 107s /tmp/autopkgtest.eenLbd/autopkgtest_tmp/build/tests/unit/test_unittest.py:166: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 107s @pytest.mark.online 107s 107s tests/integration/test_wild.py::test_xmlrpclib 107s /usr/lib/python3.12/multiprocessing/popen_fork.py:66: DeprecationWarning: This process (pid=3914) is multi-threaded, use of fork() may lead to deadlocks in the child. 107s self.pid = os.fork() 107s 107s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 107s =========================== short test summary info ============================ 107s FAILED tests/integration/test_urllib3.py::test_status_code[http] - AttributeE... 107s FAILED tests/integration/test_urllib3.py::test_headers[http] - AttributeError... 107s FAILED tests/integration/test_urllib3.py::test_body[http] - AttributeError: '... 107s FAILED tests/integration/test_urllib3.py::test_auth[http] - AttributeError: '... 107s FAILED tests/integration/test_urllib3.py::test_auth_failed[http] - AttributeE... 107s FAILED tests/integration/test_urllib3.py::test_post[http] - AttributeError: '... 
107s FAILED tests/integration/test_urllib3.py::test_gzip[http] - AttributeError: '... 107s FAILED tests/integration/test_urllib3.py::test_status_code[https] - Attribute... 107s FAILED tests/integration/test_urllib3.py::test_headers[https] - AttributeErro... 107s FAILED tests/integration/test_urllib3.py::test_body[https] - AttributeError: ... 107s FAILED tests/integration/test_urllib3.py::test_auth[https] - AttributeError: ... 107s FAILED tests/integration/test_urllib3.py::test_auth_failed[https] - Attribute... 107s FAILED tests/integration/test_urllib3.py::test_post[https] - AttributeError: ... 107s FAILED tests/integration/test_urllib3.py::test_gzip[https] - AttributeError: ... 107s FAILED tests/integration/test_proxy.py::test_use_proxy - AttributeError: 'VCR... 107s FAILED tests/integration/test_urllib3.py::test_cross_scheme - AttributeError:... 107s FAILED tests/integration/test_urllib3.py::test_https_with_cert_validation_disabled 107s FAILED tests/integration/test_wild.py::test_domain_redirect - AttributeError:... 107s FAILED tests/integration/test_wild.py::test_cookies - AttributeError: 'VCRHTT... 107s ==== 19 failed, 265 passed, 3 skipped, 19 deselected, 18 warnings in 3.20s ===== 108s autopkgtest [20:23:24]: test pybuild-autopkgtest: -----------------------] 108s pybuild-autopkgtest FAIL non-zero exit status 25 108s autopkgtest [20:23:24]: test pybuild-autopkgtest: - - - - - - - - - - results - - - - - - - - - - 109s autopkgtest [20:23:25]: @@@@@@@@@@@@@@@@@@@@ summary 109s pybuild-autopkgtest FAIL non-zero exit status 25 113s nova [W] Skipping flock for amd64 113s Creating nova instance adt-plucky-amd64-vcr.py-20250119-202136-juju-7f2275-prod-proposed-migration-environment-2-cda10e07-16b2-4784-9314-63314e3265f8 from image adt/ubuntu-plucky-amd64-server-20250119.img (UUID 7982e7e7-53fc-4a89-b206-09501ed3ffd2)... 113s nova [W] Timed out waiting for dbed9699-3418-4d4d-b162-84d4ce4515db to get deleted.
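Every failure above is the same AttributeError: urllib3 2.3.0's connection pool logs response.version_string, an attribute the stubbed VCRHTTPResponse does not provide. A minimal compatibility sketch, not vcrpy's actual patch, showing how a stub response could expose that attribute by deriving it from an http.client-style integer version field:

    # Hypothetical shim for a recorded/stubbed response object. The class name
    # and the assumption that the stub carries an integer `version`
    # (10 -> HTTP/1.0, 11 -> HTTP/1.1) are illustrative only.
    class StubResponseShim:
        version = 11

        @property
        def version_string(self) -> str:
            # Map the integer protocol version to the string urllib3 2.3.0 logs.
            return {9: "HTTP/0.9", 10: "HTTP/1.0", 11: "HTTP/1.1"}.get(self.version, "HTTP/?")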
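Separately, the PytestUnknownMarkWarning noise in the warnings summary comes from the unregistered "online" mark, as the warning text itself points out. One way to register it, sketched as a conftest.py hook (the project may instead declare it in its pytest configuration file):

    # conftest.py (illustrative): declare the custom "online" mark so pytest
    # stops emitting PytestUnknownMarkWarning for @pytest.mark.online.
    def pytest_configure(config):
        config.addinivalue_line(
            "markers", "online: tests that need real network access"
        )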