0s autopkgtest [03:22:54]: starting date and time: 2025-01-18 03:22:54+0000
0s autopkgtest [03:22:54]: git checkout: 325255d2 Merge branch 'pin-any-arch' into 'ubuntu/production'
0s autopkgtest [03:22:54]: host juju-7f2275-prod-proposed-migration-environment-20; command line: /home/ubuntu/autopkgtest/runner/autopkgtest --output-dir /tmp/autopkgtest-work.5613sq8s/out --timeout-copy=6000 --setup-commands /home/ubuntu/autopkgtest-cloud/worker-config-production/setup-canonical.sh --apt-pocket=proposed=src:python-urllib3 --apt-upgrade vcr.py --timeout-short=300 --timeout-copy=20000 --timeout-build=20000 --env=ADT_TEST_TRIGGERS=python-urllib3/2.3.0-1 -- ssh -s /home/ubuntu/autopkgtest/ssh-setup/nova -- --flavor autopkgtest-ppc64el --security-groups autopkgtest-juju-7f2275-prod-proposed-migration-environment-20@bos03-ppc64el-10.secgroup --name adt-plucky-ppc64el-vcr.py-20250118-032254-juju-7f2275-prod-proposed-migration-environment-20-31956513-6df4-4cc1-9d52-02a17818ef4e --image adt/ubuntu-plucky-ppc64el-server --keyname testbed-juju-7f2275-prod-proposed-migration-environment-20 --net-id=net_prod-proposed-migration-ppc64el -e TERM=linux -e ''"'"'http_proxy=http://squid.internal:3128'"'"'' -e ''"'"'https_proxy=http://squid.internal:3128'"'"'' -e ''"'"'no_proxy=127.0.0.1,127.0.1.1,login.ubuntu.com,localhost,localdomain,novalocal,internal,archive.ubuntu.com,ports.ubuntu.com,security.ubuntu.com,ddebs.ubuntu.com,changelogs.ubuntu.com,keyserver.ubuntu.com,launchpadlibrarian.net,launchpadcontent.net,launchpad.net,10.24.0.0/24,keystone.ps5.canonical.com,objectstorage.prodstack5.canonical.com,radosgw.ps5.canonical.com'"'"'' --mirror=http://ftpmaster.internal/ubuntu/
97s autopkgtest [03:24:31]: testbed dpkg architecture: ppc64el
97s autopkgtest [03:24:31]: testbed apt version: 2.9.18
97s autopkgtest [03:24:31]: @@@@@@@@@@@@@@@@@@@@ test bed setup
98s autopkgtest [03:24:32]: testbed release detected to be: None
98s autopkgtest [03:24:32]: updating testbed package index (apt update)
99s Get:1 http://ftpmaster.internal/ubuntu plucky-proposed InRelease [73.9 kB]
99s Hit:2 http://ftpmaster.internal/ubuntu plucky InRelease
99s Hit:3 http://ftpmaster.internal/ubuntu plucky-updates InRelease
99s Hit:4 http://ftpmaster.internal/ubuntu plucky-security InRelease
99s Get:5 http://ftpmaster.internal/ubuntu plucky-proposed/multiverse Sources [15.3 kB]
99s Get:6 http://ftpmaster.internal/ubuntu plucky-proposed/main Sources [156 kB]
99s Get:7 http://ftpmaster.internal/ubuntu plucky-proposed/restricted Sources [9708 B]
99s Get:8 http://ftpmaster.internal/ubuntu plucky-proposed/universe Sources [838 kB]
99s Get:9 http://ftpmaster.internal/ubuntu plucky-proposed/main ppc64el Packages [266 kB]
100s Get:10 http://ftpmaster.internal/ubuntu plucky-proposed/restricted ppc64el Packages [756 B]
100s Get:11 http://ftpmaster.internal/ubuntu plucky-proposed/universe ppc64el Packages [960 kB]
100s Get:12 http://ftpmaster.internal/ubuntu plucky-proposed/multiverse ppc64el Packages [14.8 kB]
100s Fetched 2335 kB in 1s (1754 kB/s)
101s Reading package lists...
102s Reading package lists...
102s Building dependency tree...
102s Reading state information...
103s Calculating upgrade...
103s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
103s Reading package lists...
103s Building dependency tree...
103s Reading state information...
103s 0 upgraded, 0 newly installed, 0 to remove and 1 not upgraded.
103s autopkgtest [03:24:37]: upgrading testbed (apt dist-upgrade and autopurge)
103s Reading package lists...
104s Building dependency tree...
104s Reading state information...
104s Calculating upgrade...Starting pkgProblemResolver with broken count: 0
104s Starting 2 pkgProblemResolver with broken count: 0
105s Done
105s Entering ResolveByKeep
105s
105s The following packages will be upgraded:
105s python3-urllib3
106s 1 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
106s Need to get 94.0 kB of archives.
106s After this operation, 18.4 kB of additional disk space will be used.
106s Get:1 http://ftpmaster.internal/ubuntu plucky-proposed/main ppc64el python3-urllib3 all 2.3.0-1 [94.0 kB]
106s Fetched 94.0 kB in 0s (271 kB/s)
107s (Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 74042 files and directories currently installed.)
107s Preparing to unpack .../python3-urllib3_2.3.0-1_all.deb ...
107s Unpacking python3-urllib3 (2.3.0-1) over (2.0.7-2ubuntu0.1) ...
107s Setting up python3-urllib3 (2.3.0-1) ...
108s Reading package lists...
108s Building dependency tree...
108s Reading state information...
108s Starting pkgProblemResolver with broken count: 0
108s Starting 2 pkgProblemResolver with broken count: 0
108s Done
108s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
111s autopkgtest [03:24:45]: testbed running kernel: Linux 6.11.0-8-generic #8-Ubuntu SMP Mon Sep 16 13:49:23 UTC 2024
111s autopkgtest [03:24:45]: @@@@@@@@@@@@@@@@@@@@ apt-source vcr.py
114s Get:1 http://ftpmaster.internal/ubuntu plucky/universe vcr.py 6.0.2-2 (dsc) [2977 B]
114s Get:2 http://ftpmaster.internal/ubuntu plucky/universe vcr.py 6.0.2-2 (tar) [339 kB]
114s Get:3 http://ftpmaster.internal/ubuntu plucky/universe vcr.py 6.0.2-2 (diff) [6348 B]
114s gpgv: Signature made Tue Dec 17 14:55:48 2024 UTC
114s gpgv: using RSA key AC0A4FF12611B6FCCF01C111393587D97D86500B
114s gpgv: Can't check signature: No public key
114s dpkg-source: warning: cannot verify inline signature for ./vcr.py_6.0.2-2.dsc: no acceptable signature found
115s autopkgtest [03:24:49]: testing package vcr.py version 6.0.2-2
116s autopkgtest [03:24:50]: build not needed
117s autopkgtest [03:24:51]: test pybuild-autopkgtest: preparing testbed
117s Reading package lists...
117s Building dependency tree...
117s Reading state information...
117s Starting pkgProblemResolver with broken count: 0 117s Starting 2 pkgProblemResolver with broken count: 0 117s Done 118s The following NEW packages will be installed: 118s autoconf automake autopoint autotools-dev build-essential cpp cpp-14 118s cpp-14-powerpc64le-linux-gnu cpp-powerpc64le-linux-gnu debhelper debugedit 118s dh-autoreconf dh-python dh-strip-nondeterminism docutils-common dwz 118s fonts-font-awesome fonts-lato g++ g++-14 g++-14-powerpc64le-linux-gnu 118s g++-powerpc64le-linux-gnu gcc gcc-14 gcc-14-powerpc64le-linux-gnu 118s gcc-powerpc64le-linux-gnu gettext intltool-debian libarchive-zip-perl 118s libasan8 libcc1-0 libdebhelper-perl libfile-stripnondeterminism-perl 118s libgcc-14-dev libgomp1 libisl23 libitm1 libjs-jquery libjs-sphinxdoc 118s libjs-underscore libjson-perl liblsan0 liblua5.4-0 libmpc3 118s libpython3.13-minimal libpython3.13-stdlib libquadmath0 libstdc++-14-dev 118s libtool libtsan2 libubsan1 m4 pandoc pandoc-data po-debconf 118s pybuild-plugin-autopkgtest python-vcr-doc python3-aiohappyeyeballs 118s python3-aiohttp python3-aiosignal python3-alabaster python3-all 118s python3-async-timeout python3-boto3 python3-botocore python3-brotli 118s python3-brotlicffi python3-click python3-dateutil python3-decorator 118s python3-defusedxml python3-docutils python3-flasgger python3-flask 118s python3-frozenlist python3-greenlet python3-httpbin python3-imagesize 118s python3-iniconfig python3-itsdangerous python3-jmespath python3-mistune 118s python3-multidict python3-packaging python3-pluggy python3-pytest 118s python3-pytest-httpbin python3-pytest-tornado python3-roman 118s python3-s3transfer python3-six python3-snowballstemmer python3-sphinx 118s python3-sphinx-rtd-theme python3-sphinxcontrib.jquery python3-tornado 118s python3-vcr python3-werkzeug python3-wrapt python3-yarl python3.13 118s python3.13-minimal sgml-base sphinx-common sphinx-rtd-theme-common xml-core 118s 0 upgraded, 106 newly installed, 0 to remove and 0 not upgraded. 118s Need to get 117 MB of archives. 118s After this operation, 679 MB of additional disk space will be used. 
118s Get:1 http://ftpmaster.internal/ubuntu plucky/main ppc64el fonts-lato all 2.015-1 [2781 kB] 119s Get:2 http://ftpmaster.internal/ubuntu plucky/main ppc64el libpython3.13-minimal ppc64el 3.13.1-2 [883 kB] 119s Get:3 http://ftpmaster.internal/ubuntu plucky/main ppc64el python3.13-minimal ppc64el 3.13.1-2 [2496 kB] 119s Get:4 http://ftpmaster.internal/ubuntu plucky/main ppc64el sgml-base all 1.31 [11.4 kB] 119s Get:5 http://ftpmaster.internal/ubuntu plucky/main ppc64el m4 ppc64el 1.4.19-4build1 [278 kB] 119s Get:6 http://ftpmaster.internal/ubuntu plucky/main ppc64el autoconf all 2.72-3 [382 kB] 119s Get:7 http://ftpmaster.internal/ubuntu plucky/main ppc64el autotools-dev all 20220109.1 [44.9 kB] 119s Get:8 http://ftpmaster.internal/ubuntu plucky/main ppc64el automake all 1:1.16.5-1.3ubuntu1 [558 kB] 119s Get:9 http://ftpmaster.internal/ubuntu plucky/main ppc64el autopoint all 0.22.5-3 [616 kB] 119s Get:10 http://ftpmaster.internal/ubuntu plucky/main ppc64el libisl23 ppc64el 0.27-1 [882 kB] 119s Get:11 http://ftpmaster.internal/ubuntu plucky/main ppc64el libmpc3 ppc64el 1.3.1-1build2 [62.1 kB] 119s Get:12 http://ftpmaster.internal/ubuntu plucky/main ppc64el cpp-14-powerpc64le-linux-gnu ppc64el 14.2.0-13ubuntu1 [10.5 MB] 119s Get:13 http://ftpmaster.internal/ubuntu plucky/main ppc64el cpp-14 ppc64el 14.2.0-13ubuntu1 [1036 B] 119s Get:14 http://ftpmaster.internal/ubuntu plucky/main ppc64el cpp-powerpc64le-linux-gnu ppc64el 4:14.1.0-2ubuntu1 [5456 B] 119s Get:15 http://ftpmaster.internal/ubuntu plucky/main ppc64el cpp ppc64el 4:14.1.0-2ubuntu1 [22.5 kB] 119s Get:16 http://ftpmaster.internal/ubuntu plucky/main ppc64el libcc1-0 ppc64el 14.2.0-13ubuntu1 [48.1 kB] 119s Get:17 http://ftpmaster.internal/ubuntu plucky/main ppc64el libgomp1 ppc64el 14.2.0-13ubuntu1 [161 kB] 119s Get:18 http://ftpmaster.internal/ubuntu plucky/main ppc64el libitm1 ppc64el 14.2.0-13ubuntu1 [32.2 kB] 119s Get:19 http://ftpmaster.internal/ubuntu plucky/main ppc64el libasan8 ppc64el 14.2.0-13ubuntu1 [2945 kB] 119s Get:20 http://ftpmaster.internal/ubuntu plucky/main ppc64el liblsan0 ppc64el 14.2.0-13ubuntu1 [1322 kB] 119s Get:21 http://ftpmaster.internal/ubuntu plucky/main ppc64el libtsan2 ppc64el 14.2.0-13ubuntu1 [2695 kB] 119s Get:22 http://ftpmaster.internal/ubuntu plucky/main ppc64el libubsan1 ppc64el 14.2.0-13ubuntu1 [1191 kB] 119s Get:23 http://ftpmaster.internal/ubuntu plucky/main ppc64el libquadmath0 ppc64el 14.2.0-13ubuntu1 [158 kB] 119s Get:24 http://ftpmaster.internal/ubuntu plucky/main ppc64el libgcc-14-dev ppc64el 14.2.0-13ubuntu1 [1620 kB] 119s Get:25 http://ftpmaster.internal/ubuntu plucky/main ppc64el gcc-14-powerpc64le-linux-gnu ppc64el 14.2.0-13ubuntu1 [20.6 MB] 120s Get:26 http://ftpmaster.internal/ubuntu plucky/main ppc64el gcc-14 ppc64el 14.2.0-13ubuntu1 [534 kB] 120s Get:27 http://ftpmaster.internal/ubuntu plucky/main ppc64el gcc-powerpc64le-linux-gnu ppc64el 4:14.1.0-2ubuntu1 [1222 B] 120s Get:28 http://ftpmaster.internal/ubuntu plucky/main ppc64el gcc ppc64el 4:14.1.0-2ubuntu1 [5006 B] 120s Get:29 http://ftpmaster.internal/ubuntu plucky/main ppc64el libstdc++-14-dev ppc64el 14.2.0-13ubuntu1 [2677 kB] 120s Get:30 http://ftpmaster.internal/ubuntu plucky/main ppc64el g++-14-powerpc64le-linux-gnu ppc64el 14.2.0-13ubuntu1 [12.0 MB] 121s Get:31 http://ftpmaster.internal/ubuntu plucky/main ppc64el g++-14 ppc64el 14.2.0-13ubuntu1 [21.1 kB] 121s Get:32 http://ftpmaster.internal/ubuntu plucky/main ppc64el g++-powerpc64le-linux-gnu ppc64el 4:14.1.0-2ubuntu1 [968 B] 121s Get:33 http://ftpmaster.internal/ubuntu 
plucky/main ppc64el g++ ppc64el 4:14.1.0-2ubuntu1 [1090 B] 121s Get:34 http://ftpmaster.internal/ubuntu plucky/main ppc64el build-essential ppc64el 12.10ubuntu1 [4936 B] 121s Get:35 http://ftpmaster.internal/ubuntu plucky/main ppc64el libdebhelper-perl all 13.20ubuntu1 [94.2 kB] 121s Get:36 http://ftpmaster.internal/ubuntu plucky/main ppc64el libtool all 2.4.7-8 [166 kB] 121s Get:37 http://ftpmaster.internal/ubuntu plucky/main ppc64el dh-autoreconf all 20 [16.1 kB] 121s Get:38 http://ftpmaster.internal/ubuntu plucky/main ppc64el libarchive-zip-perl all 1.68-1 [90.2 kB] 121s Get:39 http://ftpmaster.internal/ubuntu plucky/main ppc64el libfile-stripnondeterminism-perl all 1.14.0-1 [20.1 kB] 121s Get:40 http://ftpmaster.internal/ubuntu plucky/main ppc64el dh-strip-nondeterminism all 1.14.0-1 [5058 B] 121s Get:41 http://ftpmaster.internal/ubuntu plucky/main ppc64el debugedit ppc64el 1:5.1-1 [52.1 kB] 121s Get:42 http://ftpmaster.internal/ubuntu plucky/main ppc64el dwz ppc64el 0.15-1build6 [142 kB] 121s Get:43 http://ftpmaster.internal/ubuntu plucky/main ppc64el gettext ppc64el 0.22.5-3 [1083 kB] 121s Get:44 http://ftpmaster.internal/ubuntu plucky/main ppc64el intltool-debian all 0.35.0+20060710.6 [23.2 kB] 121s Get:45 http://ftpmaster.internal/ubuntu plucky/main ppc64el po-debconf all 1.0.21+nmu1 [233 kB] 121s Get:46 http://ftpmaster.internal/ubuntu plucky/main ppc64el debhelper all 13.20ubuntu1 [893 kB] 121s Get:47 http://ftpmaster.internal/ubuntu plucky/universe ppc64el dh-python all 6.20241217 [117 kB] 121s Get:48 http://ftpmaster.internal/ubuntu plucky/main ppc64el xml-core all 0.19 [20.3 kB] 121s Get:49 http://ftpmaster.internal/ubuntu plucky/main ppc64el docutils-common all 0.21.2+dfsg-2 [131 kB] 121s Get:50 http://ftpmaster.internal/ubuntu plucky/main ppc64el fonts-font-awesome all 5.0.10+really4.7.0~dfsg-4.1 [516 kB] 121s Get:51 http://ftpmaster.internal/ubuntu plucky/main ppc64el libjs-jquery all 3.6.1+dfsg+~3.5.14-1 [328 kB] 121s Get:52 http://ftpmaster.internal/ubuntu plucky/main ppc64el libjs-underscore all 1.13.4~dfsg+~1.11.4-3 [118 kB] 121s Get:53 http://ftpmaster.internal/ubuntu plucky/main ppc64el libjs-sphinxdoc all 8.1.3-3 [30.9 kB] 121s Get:54 http://ftpmaster.internal/ubuntu plucky/main ppc64el libjson-perl all 4.10000-1 [81.9 kB] 121s Get:55 http://ftpmaster.internal/ubuntu plucky/main ppc64el liblua5.4-0 ppc64el 5.4.7-1 [211 kB] 121s Get:56 http://ftpmaster.internal/ubuntu plucky/main ppc64el libpython3.13-stdlib ppc64el 3.13.1-2 [2131 kB] 121s Get:57 http://ftpmaster.internal/ubuntu plucky/universe ppc64el pandoc-data all 3.1.11.1-3build1 [78.8 kB] 121s Get:58 http://ftpmaster.internal/ubuntu plucky/universe ppc64el pandoc ppc64el 3.1.11.1+ds-2 [30.4 MB] 122s Get:59 http://ftpmaster.internal/ubuntu plucky/universe ppc64el pybuild-plugin-autopkgtest all 6.20241217 [1746 B] 122s Get:60 http://ftpmaster.internal/ubuntu plucky/universe ppc64el python-vcr-doc all 6.0.2-2 [184 kB] 122s Get:61 http://ftpmaster.internal/ubuntu plucky/universe ppc64el python3-aiohappyeyeballs all 2.4.4-2 [10.6 kB] 122s Get:62 http://ftpmaster.internal/ubuntu plucky/universe ppc64el python3-multidict ppc64el 6.1.0-1build1 [40.2 kB] 122s Get:63 http://ftpmaster.internal/ubuntu plucky/universe ppc64el python3-yarl ppc64el 1.13.1-1build1 [129 kB] 122s Get:64 http://ftpmaster.internal/ubuntu plucky/universe ppc64el python3-async-timeout all 5.0.1-1 [6830 B] 122s Get:65 http://ftpmaster.internal/ubuntu plucky/universe ppc64el python3-frozenlist ppc64el 1.5.0-1build1 [68.2 kB] 122s Get:66 
http://ftpmaster.internal/ubuntu plucky/universe ppc64el python3-aiosignal all 1.3.2-1 [5182 B] 122s Get:67 http://ftpmaster.internal/ubuntu plucky/universe ppc64el python3-aiohttp ppc64el 3.10.11-1 [375 kB] 122s Get:68 http://ftpmaster.internal/ubuntu plucky/main ppc64el python3.13 ppc64el 3.13.1-2 [729 kB] 122s Get:69 http://ftpmaster.internal/ubuntu plucky/main ppc64el python3-all ppc64el 3.12.8-1 [892 B] 122s Get:70 http://ftpmaster.internal/ubuntu plucky/main ppc64el python3-dateutil all 2.9.0-3 [80.2 kB] 122s Get:71 http://ftpmaster.internal/ubuntu plucky/main ppc64el python3-jmespath all 1.0.1-1 [21.3 kB] 122s Get:72 http://ftpmaster.internal/ubuntu plucky/main ppc64el python3-six all 1.17.0-1 [13.2 kB] 122s Get:73 http://ftpmaster.internal/ubuntu plucky/main ppc64el python3-botocore all 1.34.46+repack-1ubuntu1 [6211 kB] 122s Get:74 http://ftpmaster.internal/ubuntu plucky/main ppc64el python3-s3transfer all 0.10.1-1ubuntu2 [54.3 kB] 122s Get:75 http://ftpmaster.internal/ubuntu plucky/main ppc64el python3-boto3 all 1.34.46+dfsg-1ubuntu1 [72.5 kB] 122s Get:76 http://ftpmaster.internal/ubuntu plucky/universe ppc64el python3-brotli ppc64el 1.1.0-2build3 [423 kB] 122s Get:77 http://ftpmaster.internal/ubuntu plucky/universe ppc64el python3-brotlicffi ppc64el 1.1.0.0+ds1-1 [19.0 kB] 122s Get:78 http://ftpmaster.internal/ubuntu plucky/main ppc64el python3-click all 8.1.8-1 [79.8 kB] 122s Get:79 http://ftpmaster.internal/ubuntu plucky/main ppc64el python3-decorator all 5.1.1-5 [10.1 kB] 122s Get:80 http://ftpmaster.internal/ubuntu plucky/main ppc64el python3-defusedxml all 0.7.1-3 [42.2 kB] 122s Get:81 http://ftpmaster.internal/ubuntu plucky/main ppc64el python3-roman all 4.2-1 [10.0 kB] 122s Get:82 http://ftpmaster.internal/ubuntu plucky/main ppc64el python3-docutils all 0.21.2+dfsg-2 [409 kB] 122s Get:83 http://ftpmaster.internal/ubuntu plucky/main ppc64el python3-itsdangerous all 2.2.0-1 [15.2 kB] 122s Get:84 http://ftpmaster.internal/ubuntu plucky/main ppc64el python3-werkzeug all 3.1.3-2 [169 kB] 122s Get:85 http://ftpmaster.internal/ubuntu plucky/main ppc64el python3-flask all 3.1.0-2ubuntu1 [84.4 kB] 122s Get:86 http://ftpmaster.internal/ubuntu plucky/universe ppc64el python3-mistune all 3.0.2-2 [32.9 kB] 122s Get:87 http://ftpmaster.internal/ubuntu plucky/main ppc64el python3-packaging all 24.2-1 [51.5 kB] 122s Get:88 http://ftpmaster.internal/ubuntu plucky/universe ppc64el python3-flasgger all 0.9.7.2~dev2+dfsg-3 [1693 kB] 122s Get:89 http://ftpmaster.internal/ubuntu plucky/main ppc64el python3-greenlet ppc64el 3.1.0-1 [184 kB] 122s Get:90 http://ftpmaster.internal/ubuntu plucky/universe ppc64el python3-httpbin all 0.10.2+dfsg-2 [89.0 kB] 122s Get:91 http://ftpmaster.internal/ubuntu plucky/main ppc64el python3-imagesize all 1.4.1-1 [6844 B] 122s Get:92 http://ftpmaster.internal/ubuntu plucky/universe ppc64el python3-iniconfig all 1.1.1-2 [6024 B] 122s Get:93 http://ftpmaster.internal/ubuntu plucky/universe ppc64el python3-pluggy all 1.5.0-1 [21.0 kB] 122s Get:94 http://ftpmaster.internal/ubuntu plucky/universe ppc64el python3-pytest all 8.3.4-1 [252 kB] 122s Get:95 http://ftpmaster.internal/ubuntu plucky/universe ppc64el python3-pytest-httpbin all 2.1.0-1 [13.0 kB] 122s Get:96 http://ftpmaster.internal/ubuntu plucky/main ppc64el python3-tornado ppc64el 6.4.1-3 [299 kB] 122s Get:97 http://ftpmaster.internal/ubuntu plucky/universe ppc64el python3-pytest-tornado all 0.8.1-3 [7180 B] 122s Get:98 http://ftpmaster.internal/ubuntu plucky/main ppc64el python3-snowballstemmer all 
2.2.0-4build1 [59.8 kB] 122s Get:99 http://ftpmaster.internal/ubuntu plucky/main ppc64el sphinx-common all 8.1.3-3 [661 kB] 122s Get:100 http://ftpmaster.internal/ubuntu plucky/main ppc64el python3-alabaster all 0.7.16-0.1 [18.5 kB] 122s Get:101 http://ftpmaster.internal/ubuntu plucky/main ppc64el python3-sphinx all 8.1.3-3 [474 kB] 123s Get:102 http://ftpmaster.internal/ubuntu plucky/main ppc64el sphinx-rtd-theme-common all 3.0.2+dfsg-1 [1014 kB] 123s Get:103 http://ftpmaster.internal/ubuntu plucky/universe ppc64el python3-sphinxcontrib.jquery all 4.1-5 [6678 B] 123s Get:104 http://ftpmaster.internal/ubuntu plucky/universe ppc64el python3-sphinx-rtd-theme all 3.0.2+dfsg-1 [23.5 kB] 123s Get:105 http://ftpmaster.internal/ubuntu plucky/main ppc64el python3-wrapt ppc64el 1.15.0-4 [35.8 kB] 123s Get:106 http://ftpmaster.internal/ubuntu plucky/universe ppc64el python3-vcr all 6.0.2-2 [33.0 kB] 124s Fetched 117 MB in 5s (22.6 MB/s) 124s Selecting previously unselected package fonts-lato. 124s (Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 74048 files and directories currently installed.) 124s Preparing to unpack .../000-fonts-lato_2.015-1_all.deb ... 124s Unpacking fonts-lato (2.015-1) ... 124s Selecting previously unselected package libpython3.13-minimal:ppc64el. 124s Preparing to unpack .../001-libpython3.13-minimal_3.13.1-2_ppc64el.deb ... 124s Unpacking libpython3.13-minimal:ppc64el (3.13.1-2) ... 124s Selecting previously unselected package python3.13-minimal. 124s Preparing to unpack .../002-python3.13-minimal_3.13.1-2_ppc64el.deb ... 124s Unpacking python3.13-minimal (3.13.1-2) ... 124s Selecting previously unselected package sgml-base. 124s Preparing to unpack .../003-sgml-base_1.31_all.deb ... 124s Unpacking sgml-base (1.31) ... 124s Selecting previously unselected package m4. 124s Preparing to unpack .../004-m4_1.4.19-4build1_ppc64el.deb ... 124s Unpacking m4 (1.4.19-4build1) ... 124s Selecting previously unselected package autoconf. 124s Preparing to unpack .../005-autoconf_2.72-3_all.deb ... 124s Unpacking autoconf (2.72-3) ... 124s Selecting previously unselected package autotools-dev. 124s Preparing to unpack .../006-autotools-dev_20220109.1_all.deb ... 124s Unpacking autotools-dev (20220109.1) ... 124s Selecting previously unselected package automake. 124s Preparing to unpack .../007-automake_1%3a1.16.5-1.3ubuntu1_all.deb ... 124s Unpacking automake (1:1.16.5-1.3ubuntu1) ... 125s Selecting previously unselected package autopoint. 125s Preparing to unpack .../008-autopoint_0.22.5-3_all.deb ... 125s Unpacking autopoint (0.22.5-3) ... 125s Selecting previously unselected package libisl23:ppc64el. 125s Preparing to unpack .../009-libisl23_0.27-1_ppc64el.deb ... 125s Unpacking libisl23:ppc64el (0.27-1) ... 125s Selecting previously unselected package libmpc3:ppc64el. 125s Preparing to unpack .../010-libmpc3_1.3.1-1build2_ppc64el.deb ... 125s Unpacking libmpc3:ppc64el (1.3.1-1build2) ... 125s Selecting previously unselected package cpp-14-powerpc64le-linux-gnu. 
125s Preparing to unpack .../011-cpp-14-powerpc64le-linux-gnu_14.2.0-13ubuntu1_ppc64el.deb ... 125s Unpacking cpp-14-powerpc64le-linux-gnu (14.2.0-13ubuntu1) ... 125s Selecting previously unselected package cpp-14. 125s Preparing to unpack .../012-cpp-14_14.2.0-13ubuntu1_ppc64el.deb ... 125s Unpacking cpp-14 (14.2.0-13ubuntu1) ... 125s Selecting previously unselected package cpp-powerpc64le-linux-gnu. 125s Preparing to unpack .../013-cpp-powerpc64le-linux-gnu_4%3a14.1.0-2ubuntu1_ppc64el.deb ... 125s Unpacking cpp-powerpc64le-linux-gnu (4:14.1.0-2ubuntu1) ... 125s Selecting previously unselected package cpp. 125s Preparing to unpack .../014-cpp_4%3a14.1.0-2ubuntu1_ppc64el.deb ... 125s Unpacking cpp (4:14.1.0-2ubuntu1) ... 125s Selecting previously unselected package libcc1-0:ppc64el. 125s Preparing to unpack .../015-libcc1-0_14.2.0-13ubuntu1_ppc64el.deb ... 125s Unpacking libcc1-0:ppc64el (14.2.0-13ubuntu1) ... 125s Selecting previously unselected package libgomp1:ppc64el. 125s Preparing to unpack .../016-libgomp1_14.2.0-13ubuntu1_ppc64el.deb ... 125s Unpacking libgomp1:ppc64el (14.2.0-13ubuntu1) ... 125s Selecting previously unselected package libitm1:ppc64el. 125s Preparing to unpack .../017-libitm1_14.2.0-13ubuntu1_ppc64el.deb ... 125s Unpacking libitm1:ppc64el (14.2.0-13ubuntu1) ... 125s Selecting previously unselected package libasan8:ppc64el. 125s Preparing to unpack .../018-libasan8_14.2.0-13ubuntu1_ppc64el.deb ... 125s Unpacking libasan8:ppc64el (14.2.0-13ubuntu1) ... 125s Selecting previously unselected package liblsan0:ppc64el. 125s Preparing to unpack .../019-liblsan0_14.2.0-13ubuntu1_ppc64el.deb ... 125s Unpacking liblsan0:ppc64el (14.2.0-13ubuntu1) ... 125s Selecting previously unselected package libtsan2:ppc64el. 125s Preparing to unpack .../020-libtsan2_14.2.0-13ubuntu1_ppc64el.deb ... 125s Unpacking libtsan2:ppc64el (14.2.0-13ubuntu1) ... 125s Selecting previously unselected package libubsan1:ppc64el. 125s Preparing to unpack .../021-libubsan1_14.2.0-13ubuntu1_ppc64el.deb ... 125s Unpacking libubsan1:ppc64el (14.2.0-13ubuntu1) ... 125s Selecting previously unselected package libquadmath0:ppc64el. 125s Preparing to unpack .../022-libquadmath0_14.2.0-13ubuntu1_ppc64el.deb ... 125s Unpacking libquadmath0:ppc64el (14.2.0-13ubuntu1) ... 125s Selecting previously unselected package libgcc-14-dev:ppc64el. 125s Preparing to unpack .../023-libgcc-14-dev_14.2.0-13ubuntu1_ppc64el.deb ... 125s Unpacking libgcc-14-dev:ppc64el (14.2.0-13ubuntu1) ... 126s Selecting previously unselected package gcc-14-powerpc64le-linux-gnu. 126s Preparing to unpack .../024-gcc-14-powerpc64le-linux-gnu_14.2.0-13ubuntu1_ppc64el.deb ... 126s Unpacking gcc-14-powerpc64le-linux-gnu (14.2.0-13ubuntu1) ... 126s Selecting previously unselected package gcc-14. 126s Preparing to unpack .../025-gcc-14_14.2.0-13ubuntu1_ppc64el.deb ... 126s Unpacking gcc-14 (14.2.0-13ubuntu1) ... 126s Selecting previously unselected package gcc-powerpc64le-linux-gnu. 126s Preparing to unpack .../026-gcc-powerpc64le-linux-gnu_4%3a14.1.0-2ubuntu1_ppc64el.deb ... 126s Unpacking gcc-powerpc64le-linux-gnu (4:14.1.0-2ubuntu1) ... 126s Selecting previously unselected package gcc. 126s Preparing to unpack .../027-gcc_4%3a14.1.0-2ubuntu1_ppc64el.deb ... 126s Unpacking gcc (4:14.1.0-2ubuntu1) ... 126s Selecting previously unselected package libstdc++-14-dev:ppc64el. 126s Preparing to unpack .../028-libstdc++-14-dev_14.2.0-13ubuntu1_ppc64el.deb ... 126s Unpacking libstdc++-14-dev:ppc64el (14.2.0-13ubuntu1) ... 
126s Selecting previously unselected package g++-14-powerpc64le-linux-gnu. 126s Preparing to unpack .../029-g++-14-powerpc64le-linux-gnu_14.2.0-13ubuntu1_ppc64el.deb ... 126s Unpacking g++-14-powerpc64le-linux-gnu (14.2.0-13ubuntu1) ... 127s Selecting previously unselected package g++-14. 127s Preparing to unpack .../030-g++-14_14.2.0-13ubuntu1_ppc64el.deb ... 127s Unpacking g++-14 (14.2.0-13ubuntu1) ... 127s Selecting previously unselected package g++-powerpc64le-linux-gnu. 127s Preparing to unpack .../031-g++-powerpc64le-linux-gnu_4%3a14.1.0-2ubuntu1_ppc64el.deb ... 127s Unpacking g++-powerpc64le-linux-gnu (4:14.1.0-2ubuntu1) ... 127s Selecting previously unselected package g++. 127s Preparing to unpack .../032-g++_4%3a14.1.0-2ubuntu1_ppc64el.deb ... 127s Unpacking g++ (4:14.1.0-2ubuntu1) ... 127s Selecting previously unselected package build-essential. 127s Preparing to unpack .../033-build-essential_12.10ubuntu1_ppc64el.deb ... 127s Unpacking build-essential (12.10ubuntu1) ... 127s Selecting previously unselected package libdebhelper-perl. 127s Preparing to unpack .../034-libdebhelper-perl_13.20ubuntu1_all.deb ... 127s Unpacking libdebhelper-perl (13.20ubuntu1) ... 127s Selecting previously unselected package libtool. 127s Preparing to unpack .../035-libtool_2.4.7-8_all.deb ... 127s Unpacking libtool (2.4.7-8) ... 127s Selecting previously unselected package dh-autoreconf. 127s Preparing to unpack .../036-dh-autoreconf_20_all.deb ... 127s Unpacking dh-autoreconf (20) ... 127s Selecting previously unselected package libarchive-zip-perl. 127s Preparing to unpack .../037-libarchive-zip-perl_1.68-1_all.deb ... 127s Unpacking libarchive-zip-perl (1.68-1) ... 127s Selecting previously unselected package libfile-stripnondeterminism-perl. 127s Preparing to unpack .../038-libfile-stripnondeterminism-perl_1.14.0-1_all.deb ... 127s Unpacking libfile-stripnondeterminism-perl (1.14.0-1) ... 127s Selecting previously unselected package dh-strip-nondeterminism. 127s Preparing to unpack .../039-dh-strip-nondeterminism_1.14.0-1_all.deb ... 127s Unpacking dh-strip-nondeterminism (1.14.0-1) ... 127s Selecting previously unselected package debugedit. 127s Preparing to unpack .../040-debugedit_1%3a5.1-1_ppc64el.deb ... 127s Unpacking debugedit (1:5.1-1) ... 127s Selecting previously unselected package dwz. 127s Preparing to unpack .../041-dwz_0.15-1build6_ppc64el.deb ... 127s Unpacking dwz (0.15-1build6) ... 127s Selecting previously unselected package gettext. 127s Preparing to unpack .../042-gettext_0.22.5-3_ppc64el.deb ... 127s Unpacking gettext (0.22.5-3) ... 127s Selecting previously unselected package intltool-debian. 127s Preparing to unpack .../043-intltool-debian_0.35.0+20060710.6_all.deb ... 127s Unpacking intltool-debian (0.35.0+20060710.6) ... 127s Selecting previously unselected package po-debconf. 127s Preparing to unpack .../044-po-debconf_1.0.21+nmu1_all.deb ... 127s Unpacking po-debconf (1.0.21+nmu1) ... 127s Selecting previously unselected package debhelper. 127s Preparing to unpack .../045-debhelper_13.20ubuntu1_all.deb ... 127s Unpacking debhelper (13.20ubuntu1) ... 127s Selecting previously unselected package dh-python. 127s Preparing to unpack .../046-dh-python_6.20241217_all.deb ... 127s Unpacking dh-python (6.20241217) ... 127s Selecting previously unselected package xml-core. 127s Preparing to unpack .../047-xml-core_0.19_all.deb ... 127s Unpacking xml-core (0.19) ... 127s Selecting previously unselected package docutils-common. 
127s Preparing to unpack .../048-docutils-common_0.21.2+dfsg-2_all.deb ... 127s Unpacking docutils-common (0.21.2+dfsg-2) ... 127s Selecting previously unselected package fonts-font-awesome. 127s Preparing to unpack .../049-fonts-font-awesome_5.0.10+really4.7.0~dfsg-4.1_all.deb ... 128s Unpacking fonts-font-awesome (5.0.10+really4.7.0~dfsg-4.1) ... 128s Selecting previously unselected package libjs-jquery. 128s Preparing to unpack .../050-libjs-jquery_3.6.1+dfsg+~3.5.14-1_all.deb ... 128s Unpacking libjs-jquery (3.6.1+dfsg+~3.5.14-1) ... 128s Selecting previously unselected package libjs-underscore. 128s Preparing to unpack .../051-libjs-underscore_1.13.4~dfsg+~1.11.4-3_all.deb ... 128s Unpacking libjs-underscore (1.13.4~dfsg+~1.11.4-3) ... 128s Selecting previously unselected package libjs-sphinxdoc. 128s Preparing to unpack .../052-libjs-sphinxdoc_8.1.3-3_all.deb ... 128s Unpacking libjs-sphinxdoc (8.1.3-3) ... 128s Selecting previously unselected package libjson-perl. 128s Preparing to unpack .../053-libjson-perl_4.10000-1_all.deb ... 128s Unpacking libjson-perl (4.10000-1) ... 128s Selecting previously unselected package liblua5.4-0:ppc64el. 128s Preparing to unpack .../054-liblua5.4-0_5.4.7-1_ppc64el.deb ... 128s Unpacking liblua5.4-0:ppc64el (5.4.7-1) ... 128s Selecting previously unselected package libpython3.13-stdlib:ppc64el. 128s Preparing to unpack .../055-libpython3.13-stdlib_3.13.1-2_ppc64el.deb ... 128s Unpacking libpython3.13-stdlib:ppc64el (3.13.1-2) ... 128s Selecting previously unselected package pandoc-data. 128s Preparing to unpack .../056-pandoc-data_3.1.11.1-3build1_all.deb ... 128s Unpacking pandoc-data (3.1.11.1-3build1) ... 128s Selecting previously unselected package pandoc. 128s Preparing to unpack .../057-pandoc_3.1.11.1+ds-2_ppc64el.deb ... 128s Unpacking pandoc (3.1.11.1+ds-2) ... 130s Selecting previously unselected package pybuild-plugin-autopkgtest. 130s Preparing to unpack .../058-pybuild-plugin-autopkgtest_6.20241217_all.deb ... 130s Unpacking pybuild-plugin-autopkgtest (6.20241217) ... 130s Selecting previously unselected package python-vcr-doc. 130s Preparing to unpack .../059-python-vcr-doc_6.0.2-2_all.deb ... 130s Unpacking python-vcr-doc (6.0.2-2) ... 130s Selecting previously unselected package python3-aiohappyeyeballs. 130s Preparing to unpack .../060-python3-aiohappyeyeballs_2.4.4-2_all.deb ... 130s Unpacking python3-aiohappyeyeballs (2.4.4-2) ... 130s Selecting previously unselected package python3-multidict. 130s Preparing to unpack .../061-python3-multidict_6.1.0-1build1_ppc64el.deb ... 130s Unpacking python3-multidict (6.1.0-1build1) ... 130s Selecting previously unselected package python3-yarl. 130s Preparing to unpack .../062-python3-yarl_1.13.1-1build1_ppc64el.deb ... 130s Unpacking python3-yarl (1.13.1-1build1) ... 130s Selecting previously unselected package python3-async-timeout. 130s Preparing to unpack .../063-python3-async-timeout_5.0.1-1_all.deb ... 130s Unpacking python3-async-timeout (5.0.1-1) ... 130s Selecting previously unselected package python3-frozenlist. 130s Preparing to unpack .../064-python3-frozenlist_1.5.0-1build1_ppc64el.deb ... 130s Unpacking python3-frozenlist (1.5.0-1build1) ... 130s Selecting previously unselected package python3-aiosignal. 130s Preparing to unpack .../065-python3-aiosignal_1.3.2-1_all.deb ... 130s Unpacking python3-aiosignal (1.3.2-1) ... 130s Selecting previously unselected package python3-aiohttp. 130s Preparing to unpack .../066-python3-aiohttp_3.10.11-1_ppc64el.deb ... 
130s Unpacking python3-aiohttp (3.10.11-1) ... 130s Selecting previously unselected package python3.13. 130s Preparing to unpack .../067-python3.13_3.13.1-2_ppc64el.deb ... 130s Unpacking python3.13 (3.13.1-2) ... 130s Selecting previously unselected package python3-all. 130s Preparing to unpack .../068-python3-all_3.12.8-1_ppc64el.deb ... 130s Unpacking python3-all (3.12.8-1) ... 130s Selecting previously unselected package python3-dateutil. 130s Preparing to unpack .../069-python3-dateutil_2.9.0-3_all.deb ... 130s Unpacking python3-dateutil (2.9.0-3) ... 130s Selecting previously unselected package python3-jmespath. 130s Preparing to unpack .../070-python3-jmespath_1.0.1-1_all.deb ... 130s Unpacking python3-jmespath (1.0.1-1) ... 130s Selecting previously unselected package python3-six. 130s Preparing to unpack .../071-python3-six_1.17.0-1_all.deb ... 130s Unpacking python3-six (1.17.0-1) ... 130s Selecting previously unselected package python3-botocore. 130s Preparing to unpack .../072-python3-botocore_1.34.46+repack-1ubuntu1_all.deb ... 130s Unpacking python3-botocore (1.34.46+repack-1ubuntu1) ... 131s Selecting previously unselected package python3-s3transfer. 131s Preparing to unpack .../073-python3-s3transfer_0.10.1-1ubuntu2_all.deb ... 131s Unpacking python3-s3transfer (0.10.1-1ubuntu2) ... 131s Selecting previously unselected package python3-boto3. 131s Preparing to unpack .../074-python3-boto3_1.34.46+dfsg-1ubuntu1_all.deb ... 131s Unpacking python3-boto3 (1.34.46+dfsg-1ubuntu1) ... 131s Selecting previously unselected package python3-brotli. 131s Preparing to unpack .../075-python3-brotli_1.1.0-2build3_ppc64el.deb ... 131s Unpacking python3-brotli (1.1.0-2build3) ... 131s Selecting previously unselected package python3-brotlicffi. 131s Preparing to unpack .../076-python3-brotlicffi_1.1.0.0+ds1-1_ppc64el.deb ... 131s Unpacking python3-brotlicffi (1.1.0.0+ds1-1) ... 131s Selecting previously unselected package python3-click. 131s Preparing to unpack .../077-python3-click_8.1.8-1_all.deb ... 131s Unpacking python3-click (8.1.8-1) ... 131s Selecting previously unselected package python3-decorator. 131s Preparing to unpack .../078-python3-decorator_5.1.1-5_all.deb ... 131s Unpacking python3-decorator (5.1.1-5) ... 131s Selecting previously unselected package python3-defusedxml. 131s Preparing to unpack .../079-python3-defusedxml_0.7.1-3_all.deb ... 131s Unpacking python3-defusedxml (0.7.1-3) ... 131s Selecting previously unselected package python3-roman. 131s Preparing to unpack .../080-python3-roman_4.2-1_all.deb ... 131s Unpacking python3-roman (4.2-1) ... 131s Selecting previously unselected package python3-docutils. 131s Preparing to unpack .../081-python3-docutils_0.21.2+dfsg-2_all.deb ... 131s Unpacking python3-docutils (0.21.2+dfsg-2) ... 132s Selecting previously unselected package python3-itsdangerous. 132s Preparing to unpack .../082-python3-itsdangerous_2.2.0-1_all.deb ... 132s Unpacking python3-itsdangerous (2.2.0-1) ... 132s Selecting previously unselected package python3-werkzeug. 132s Preparing to unpack .../083-python3-werkzeug_3.1.3-2_all.deb ... 132s Unpacking python3-werkzeug (3.1.3-2) ... 132s Selecting previously unselected package python3-flask. 132s Preparing to unpack .../084-python3-flask_3.1.0-2ubuntu1_all.deb ... 132s Unpacking python3-flask (3.1.0-2ubuntu1) ... 132s Selecting previously unselected package python3-mistune. 132s Preparing to unpack .../085-python3-mistune_3.0.2-2_all.deb ... 132s Unpacking python3-mistune (3.0.2-2) ... 
132s Selecting previously unselected package python3-packaging. 132s Preparing to unpack .../086-python3-packaging_24.2-1_all.deb ... 132s Unpacking python3-packaging (24.2-1) ... 132s Selecting previously unselected package python3-flasgger. 132s Preparing to unpack .../087-python3-flasgger_0.9.7.2~dev2+dfsg-3_all.deb ... 132s Unpacking python3-flasgger (0.9.7.2~dev2+dfsg-3) ... 132s Selecting previously unselected package python3-greenlet. 132s Preparing to unpack .../088-python3-greenlet_3.1.0-1_ppc64el.deb ... 132s Unpacking python3-greenlet (3.1.0-1) ... 132s Selecting previously unselected package python3-httpbin. 132s Preparing to unpack .../089-python3-httpbin_0.10.2+dfsg-2_all.deb ... 132s Unpacking python3-httpbin (0.10.2+dfsg-2) ... 132s Selecting previously unselected package python3-imagesize. 132s Preparing to unpack .../090-python3-imagesize_1.4.1-1_all.deb ... 132s Unpacking python3-imagesize (1.4.1-1) ... 132s Selecting previously unselected package python3-iniconfig. 132s Preparing to unpack .../091-python3-iniconfig_1.1.1-2_all.deb ... 132s Unpacking python3-iniconfig (1.1.1-2) ... 132s Selecting previously unselected package python3-pluggy. 132s Preparing to unpack .../092-python3-pluggy_1.5.0-1_all.deb ... 132s Unpacking python3-pluggy (1.5.0-1) ... 132s Selecting previously unselected package python3-pytest. 132s Preparing to unpack .../093-python3-pytest_8.3.4-1_all.deb ... 132s Unpacking python3-pytest (8.3.4-1) ... 132s Selecting previously unselected package python3-pytest-httpbin. 132s Preparing to unpack .../094-python3-pytest-httpbin_2.1.0-1_all.deb ... 132s Unpacking python3-pytest-httpbin (2.1.0-1) ... 132s Selecting previously unselected package python3-tornado. 132s Preparing to unpack .../095-python3-tornado_6.4.1-3_ppc64el.deb ... 132s Unpacking python3-tornado (6.4.1-3) ... 132s Selecting previously unselected package python3-pytest-tornado. 132s Preparing to unpack .../096-python3-pytest-tornado_0.8.1-3_all.deb ... 132s Unpacking python3-pytest-tornado (0.8.1-3) ... 132s Selecting previously unselected package python3-snowballstemmer. 132s Preparing to unpack .../097-python3-snowballstemmer_2.2.0-4build1_all.deb ... 132s Unpacking python3-snowballstemmer (2.2.0-4build1) ... 132s Selecting previously unselected package sphinx-common. 132s Preparing to unpack .../098-sphinx-common_8.1.3-3_all.deb ... 132s Unpacking sphinx-common (8.1.3-3) ... 133s Selecting previously unselected package python3-alabaster. 133s Preparing to unpack .../099-python3-alabaster_0.7.16-0.1_all.deb ... 133s Unpacking python3-alabaster (0.7.16-0.1) ... 133s Selecting previously unselected package python3-sphinx. 133s Preparing to unpack .../100-python3-sphinx_8.1.3-3_all.deb ... 133s Unpacking python3-sphinx (8.1.3-3) ... 133s Selecting previously unselected package sphinx-rtd-theme-common. 133s Preparing to unpack .../101-sphinx-rtd-theme-common_3.0.2+dfsg-1_all.deb ... 133s Unpacking sphinx-rtd-theme-common (3.0.2+dfsg-1) ... 133s Selecting previously unselected package python3-sphinxcontrib.jquery. 133s Preparing to unpack .../102-python3-sphinxcontrib.jquery_4.1-5_all.deb ... 133s Unpacking python3-sphinxcontrib.jquery (4.1-5) ... 133s Selecting previously unselected package python3-sphinx-rtd-theme. 133s Preparing to unpack .../103-python3-sphinx-rtd-theme_3.0.2+dfsg-1_all.deb ... 133s Unpacking python3-sphinx-rtd-theme (3.0.2+dfsg-1) ... 133s Selecting previously unselected package python3-wrapt. 133s Preparing to unpack .../104-python3-wrapt_1.15.0-4_ppc64el.deb ... 
133s Unpacking python3-wrapt (1.15.0-4) ... 133s Selecting previously unselected package python3-vcr. 133s Preparing to unpack .../105-python3-vcr_6.0.2-2_all.deb ... 133s Unpacking python3-vcr (6.0.2-2) ... 133s Setting up dh-python (6.20241217) ... 133s Setting up python3-iniconfig (1.1.1-2) ... 134s Setting up python3-tornado (6.4.1-3) ... 135s Setting up python3-brotlicffi (1.1.0.0+ds1-1) ... 135s Setting up fonts-lato (2.015-1) ... 135s Setting up python3-defusedxml (0.7.1-3) ... 135s Setting up libarchive-zip-perl (1.68-1) ... 135s Setting up python3-alabaster (0.7.16-0.1) ... 135s Setting up libdebhelper-perl (13.20ubuntu1) ... 135s Setting up m4 (1.4.19-4build1) ... 135s Setting up python3-itsdangerous (2.2.0-1) ... 136s Setting up libgomp1:ppc64el (14.2.0-13ubuntu1) ... 136s Setting up python3-click (8.1.8-1) ... 136s Setting up python3-multidict (6.1.0-1build1) ... 136s Setting up python3-frozenlist (1.5.0-1build1) ... 137s Setting up python3-aiosignal (1.3.2-1) ... 137s Setting up python3-async-timeout (5.0.1-1) ... 137s Setting up python3-six (1.17.0-1) ... 137s Setting up libpython3.13-minimal:ppc64el (3.13.1-2) ... 137s Setting up python3-roman (4.2-1) ... 137s Setting up python3-decorator (5.1.1-5) ... 138s Setting up autotools-dev (20220109.1) ... 138s Setting up python3-packaging (24.2-1) ... 138s Setting up python3-snowballstemmer (2.2.0-4build1) ... 139s Setting up python3-werkzeug (3.1.3-2) ... 139s Setting up python3-jmespath (1.0.1-1) ... 140s Setting up python3-brotli (1.1.0-2build3) ... 140s Setting up python3-greenlet (3.1.0-1) ... 140s Setting up libquadmath0:ppc64el (14.2.0-13ubuntu1) ... 140s Setting up libmpc3:ppc64el (1.3.1-1build2) ... 140s Setting up python3-wrapt (1.15.0-4) ... 140s Setting up autopoint (0.22.5-3) ... 140s Setting up python3-aiohappyeyeballs (2.4.4-2) ... 141s Setting up autoconf (2.72-3) ... 141s Setting up python3-pluggy (1.5.0-1) ... 141s Setting up libubsan1:ppc64el (14.2.0-13ubuntu1) ... 141s Setting up dwz (0.15-1build6) ... 141s Setting up libasan8:ppc64el (14.2.0-13ubuntu1) ... 141s Setting up libjson-perl (4.10000-1) ... 141s Setting up debugedit (1:5.1-1) ... 141s Setting up liblua5.4-0:ppc64el (5.4.7-1) ... 141s Setting up python3.13-minimal (3.13.1-2) ... 142s Setting up python3-dateutil (2.9.0-3) ... 143s Setting up sgml-base (1.31) ... 143s Setting up pandoc-data (3.1.11.1-3build1) ... 143s Setting up libtsan2:ppc64el (14.2.0-13ubuntu1) ... 143s Setting up libjs-jquery (3.6.1+dfsg+~3.5.14-1) ... 143s Setting up libisl23:ppc64el (0.27-1) ... 143s Setting up python3-yarl (1.13.1-1build1) ... 143s Setting up python3-mistune (3.0.2-2) ... 143s Setting up libpython3.13-stdlib:ppc64el (3.13.1-2) ... 143s Setting up fonts-font-awesome (5.0.10+really4.7.0~dfsg-4.1) ... 143s Setting up sphinx-rtd-theme-common (3.0.2+dfsg-1) ... 143s Setting up libcc1-0:ppc64el (14.2.0-13ubuntu1) ... 143s Setting up liblsan0:ppc64el (14.2.0-13ubuntu1) ... 143s Setting up libitm1:ppc64el (14.2.0-13ubuntu1) ... 143s Setting up libjs-underscore (1.13.4~dfsg+~1.11.4-3) ... 143s Setting up python3-imagesize (1.4.1-1) ... 144s Setting up automake (1:1.16.5-1.3ubuntu1) ... 144s update-alternatives: using /usr/bin/automake-1.16 to provide /usr/bin/automake (automake) in auto mode 144s Setting up libfile-stripnondeterminism-perl (1.14.0-1) ... 144s Setting up gettext (0.22.5-3) ... 144s Setting up python3.13 (3.13.1-2) ... 145s Setting up python3-pytest (8.3.4-1) ... 146s Setting up python3-flask (3.1.0-2ubuntu1) ... 
146s Setting up python3-aiohttp (3.10.11-1) ...
147s Setting up python3-all (3.12.8-1) ...
147s Setting up intltool-debian (0.35.0+20060710.6) ...
147s Setting up pandoc (3.1.11.1+ds-2) ...
147s Setting up python3-pytest-tornado (0.8.1-3) ...
147s Setting up python3-botocore (1.34.46+repack-1ubuntu1) ...
148s Setting up python3-vcr (6.0.2-2) ...
148s Setting up cpp-14-powerpc64le-linux-gnu (14.2.0-13ubuntu1) ...
148s Setting up libjs-sphinxdoc (8.1.3-3) ...
148s Setting up cpp-14 (14.2.0-13ubuntu1) ...
148s Setting up dh-strip-nondeterminism (1.14.0-1) ...
148s Setting up xml-core (0.19) ...
148s Setting up libgcc-14-dev:ppc64el (14.2.0-13ubuntu1) ...
148s Setting up libstdc++-14-dev:ppc64el (14.2.0-13ubuntu1) ...
148s Setting up cpp-powerpc64le-linux-gnu (4:14.1.0-2ubuntu1) ...
148s Setting up gcc-14-powerpc64le-linux-gnu (14.2.0-13ubuntu1) ...
148s Setting up python-vcr-doc (6.0.2-2) ...
148s Setting up g++-14-powerpc64le-linux-gnu (14.2.0-13ubuntu1) ...
148s Setting up python3-flasgger (0.9.7.2~dev2+dfsg-3) ...
148s Setting up po-debconf (1.0.21+nmu1) ...
148s Setting up python3-s3transfer (0.10.1-1ubuntu2) ...
149s Setting up gcc-14 (14.2.0-13ubuntu1) ...
149s Setting up gcc-powerpc64le-linux-gnu (4:14.1.0-2ubuntu1) ...
149s Setting up sphinx-common (8.1.3-3) ...
149s Setting up python3-boto3 (1.34.46+dfsg-1ubuntu1) ...
149s Setting up python3-httpbin (0.10.2+dfsg-2) ...
149s Setting up cpp (4:14.1.0-2ubuntu1) ...
149s Setting up python3-pytest-httpbin (2.1.0-1) ...
150s Setting up g++-14 (14.2.0-13ubuntu1) ...
150s Setting up g++-powerpc64le-linux-gnu (4:14.1.0-2ubuntu1) ...
150s Setting up libtool (2.4.7-8) ...
150s Setting up gcc (4:14.1.0-2ubuntu1) ...
150s Setting up dh-autoreconf (20) ...
150s Setting up g++ (4:14.1.0-2ubuntu1) ...
150s update-alternatives: using /usr/bin/g++ to provide /usr/bin/c++ (c++) in auto mode
150s Setting up build-essential (12.10ubuntu1) ...
150s Setting up debhelper (13.20ubuntu1) ...
150s Setting up pybuild-plugin-autopkgtest (6.20241217) ...
150s Processing triggers for install-info (7.1.1-1) ...
150s Processing triggers for libc-bin (2.40-4ubuntu1) ...
150s Processing triggers for systemd (257-2ubuntu1) ...
150s Processing triggers for man-db (2.13.0-1) ...
154s Processing triggers for sgml-base (1.31) ...
154s Setting up docutils-common (0.21.2+dfsg-2) ...
154s Processing triggers for sgml-base (1.31) ...
154s Setting up python3-docutils (0.21.2+dfsg-2) ...
155s Setting up python3-sphinx (8.1.3-3) ...
156s Setting up python3-sphinxcontrib.jquery (4.1-5) ...
157s Setting up python3-sphinx-rtd-theme (3.0.2+dfsg-1) ...
158s autopkgtest [03:25:32]: test pybuild-autopkgtest: pybuild-autopkgtest
158s autopkgtest [03:25:32]: test pybuild-autopkgtest: [-----------------------
158s pybuild-autopkgtest
159s I: pybuild base:311: cd /tmp/autopkgtest.MqadJQ/autopkgtest_tmp/build; python3.13 -m pytest --ignore tests/integration/test_aiohttp.py --ignore tests/integration/test_tornado.py --ignore tests/integration/test_requests.py -m "not online" -k "not test_basic_json_use and not test_load_cassette_with_custom_persister"
162s ============================= test session starts ==============================
162s platform linux -- Python 3.13.1, pytest-8.3.4, pluggy-1.5.0
162s rootdir: /tmp/autopkgtest.MqadJQ/autopkgtest_tmp/build
162s plugins: typeguard-4.4.1, httpbin-2.1.0, tornado-0.8.1
162s collected 305 items / 19 deselected / 1 skipped / 286 selected
162s
162s tests/integration/test_basic.py .... [ 1%]
162s tests/integration/test_boto3.py ss [ 2%]
162s tests/integration/test_config.py . [ 2%]
162s tests/integration/test_filter.py .......... [ 5%]
162s tests/integration/test_httplib2.py ........ [ 8%]
162s tests/integration/test_urllib2.py ........ [ 11%]
162s tests/integration/test_urllib3.py F127.0.0.1 - - [18/Jan/2025 03:25:36] "GET / HTTP/1.1" 200 9358
163s FFFFFF [ 13%]
163s tests/integration/test_httplib2.py ........ [ 16%]
163s tests/integration/test_urllib2.py ........ [ 19%]
163s tests/integration/test_urllib3.py FFFFFFF [ 22%]
163s tests/integration/test_httplib2.py . [ 22%]
164s tests/integration/test_ignore.py .... [ 23%]
164s tests/integration/test_matchers.py .............. [ 28%]
164s tests/integration/test_multiple.py . [ 29%]
164s tests/integration/test_proxy.py F [ 29%]
164s tests/integration/test_record_mode.py ........ [ 32%]
164s tests/integration/test_register_persister.py .. [ 32%]
164s tests/integration/test_register_serializer.py . [ 33%]
164s tests/integration/test_request.py .. [ 33%]
164s tests/integration/test_stubs.py .... [ 35%]
164s tests/integration/test_urllib2.py . [ 35%]
164s tests/integration/test_urllib3.py FF. [ 36%]
164s tests/integration/test_wild.py F.F. [ 38%]
164s tests/unit/test_cassettes.py ............................... [ 48%]
164s tests/unit/test_errors.py .... [ 50%]
164s tests/unit/test_filters.py ........................ [ 58%]
164s tests/unit/test_json_serializer.py . [ 59%]
164s tests/unit/test_matchers.py ............................ [ 68%]
164s tests/unit/test_migration.py ... [ 69%]
164s tests/unit/test_persist.py .... [ 71%]
164s tests/unit/test_request.py ................. [ 77%]
164s tests/unit/test_response.py .... [ 78%]
165s tests/unit/test_serialize.py ............... [ 83%]
165s tests/unit/test_stubs.py ... [ 84%]
165s tests/unit/test_unittest.py ....... [ 87%]
165s tests/unit/test_util.py ........... [ 91%]
165s tests/unit/test_vcr.py ........................ [ 99%]
166s tests/unit/test_vcr_import.py .
[100%] 166s 166s =================================== FAILURES =================================== 166s ____________________________ test_status_code[http] ____________________________ 166s 166s httpbin_both = 166s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-0/test_status_code_http_0') 166s verify_pool_mgr = 166s 166s def test_status_code(httpbin_both, tmpdir, verify_pool_mgr): 166s """Ensure that we can read the status code""" 166s url = httpbin_both.url 166s with vcr.use_cassette(str(tmpdir.join("atts.yaml"))): 166s > status_code = verify_pool_mgr.request("GET", url).status 166s 166s tests/integration/test_urllib3.py:34: 166s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 166s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 166s return self.request_encode_url( 166s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 166s return self.urlopen(method, url, **extra_kw) 166s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 166s response = conn.urlopen(method, u.request_uri, **kw) 166s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 166s response = self._make_request( 166s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 166s 166s self = 166s conn = 166s method = 'GET', url = '/', body = None, headers = {} 166s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 166s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 166s chunked = False, response_conn = None, preload_content = True 166s decode_content = True, enforce_content_length = True 166s 166s def _make_request( 166s self, 166s conn: BaseHTTPConnection, 166s method: str, 166s url: str, 166s body: _TYPE_BODY | None = None, 166s headers: typing.Mapping[str, str] | None = None, 166s retries: Retry | None = None, 166s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 166s chunked: bool = False, 166s response_conn: BaseHTTPConnection | None = None, 166s preload_content: bool = True, 166s decode_content: bool = True, 166s enforce_content_length: bool = True, 166s ) -> BaseHTTPResponse: 166s """ 166s Perform a request on a given urllib connection object taken from our 166s pool. 166s 166s :param conn: 166s a connection from one of our connection pools 166s 166s :param method: 166s HTTP request method (such as GET, POST, PUT, etc.) 166s 166s :param url: 166s The URL to perform the request on. 166s 166s :param body: 166s Data to send in the request body, either :class:`str`, :class:`bytes`, 166s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 166s 166s :param headers: 166s Dictionary of custom headers to send, such as User-Agent, 166s If-None-Match, etc. If None, pool headers are used. If provided, 166s these headers completely replace any pool-specific headers. 166s 166s :param retries: 166s Configure the number of retries to allow before raising a 166s :class:`~urllib3.exceptions.MaxRetryError` exception. 166s 166s Pass ``None`` to retry until you receive a response. Pass a 166s :class:`~urllib3.util.retry.Retry` object for fine-grained control 166s over different types of retries. 166s Pass an integer number to retry connection errors that many times, 166s but no other types of errors. Pass zero to never retry. 166s 166s If ``False``, then retries are disabled and any exception is raised 166s immediately. 
Also, instead of raising a MaxRetryError on redirects, 166s the redirect response will be returned. 166s 166s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 166s 166s :param timeout: 166s If specified, overrides the default timeout for this one 166s request. It may be a float (in seconds) or an instance of 166s :class:`urllib3.util.Timeout`. 166s 166s :param chunked: 166s If True, urllib3 will send the body using chunked transfer 166s encoding. Otherwise, urllib3 will send the body using the standard 166s content-length form. Defaults to False. 166s 166s :param response_conn: 166s Set this to ``None`` if you will handle releasing the connection or 166s set the connection to have the response release it. 166s 166s :param preload_content: 166s If True, the response's body will be preloaded during construction. 166s 166s :param decode_content: 166s If True, will attempt to decode the body based on the 166s 'content-encoding' header. 166s 166s :param enforce_content_length: 166s Enforce content length checking. Body returned by server must match 166s value of Content-Length header, if present. Otherwise, raise error. 166s """ 166s self.num_requests += 1 166s 166s timeout_obj = self._get_timeout(timeout) 166s timeout_obj.start_connect() 166s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 166s 166s try: 166s # Trigger any extra validation we need to do. 166s try: 166s self._validate_conn(conn) 166s except (SocketTimeout, BaseSSLError) as e: 166s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 166s raise 166s 166s # _validate_conn() starts the connection to an HTTPS proxy 166s # so we need to wrap errors with 'ProxyError' here too. 166s except ( 166s OSError, 166s NewConnectionError, 166s TimeoutError, 166s BaseSSLError, 166s CertificateError, 166s SSLError, 166s ) as e: 166s new_e: Exception = e 166s if isinstance(e, (BaseSSLError, CertificateError)): 166s new_e = SSLError(e) 166s # If the connection didn't successfully connect to it's proxy 166s # then there 166s if isinstance( 166s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 166s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 166s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 166s raise new_e 166s 166s # conn.request() calls http.client.*.request, not the method in 166s # urllib3.request. It also calls makefile (recv) on the socket. 166s try: 166s conn.request( 166s method, 166s url, 166s body=body, 166s headers=headers, 166s chunked=chunked, 166s preload_content=preload_content, 166s decode_content=decode_content, 166s enforce_content_length=enforce_content_length, 166s ) 166s 166s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 166s # legitimately able to close the connection after sending a valid response. 166s # With this behaviour, the received response is still readable. 166s except BrokenPipeError: 166s pass 166s except OSError as e: 166s # MacOS/Linux 166s # EPROTOTYPE and ECONNRESET are needed on macOS 166s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 166s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
166s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 166s raise 166s 166s # Reset the timeout for the recv() on the socket 166s read_timeout = timeout_obj.read_timeout 166s 166s if not conn.is_closed: 166s # In Python 3 socket.py will catch EAGAIN and return None when you 166s # try and read into the file pointer created by http.client, which 166s # instead raises a BadStatusLine exception. Instead of catching 166s # the exception and assuming all BadStatusLine exceptions are read 166s # timeouts, check for a zero timeout before making the request. 166s if read_timeout == 0: 166s raise ReadTimeoutError( 166s self, url, f"Read timed out. (read timeout={read_timeout})" 166s ) 166s conn.timeout = read_timeout 166s 166s # Receive the response from the server 166s try: 166s response = conn.getresponse() 166s except (BaseSSLError, OSError) as e: 166s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 166s raise 166s 166s # Set properties that are used by the pooling layer. 166s response.retries = retries 166s response._connection = response_conn # type: ignore[attr-defined] 166s response._pool = self # type: ignore[attr-defined] 166s 166s log.debug( 166s '%s://%s:%s "%s %s %s" %s %s', 166s self.scheme, 166s self.host, 166s self.port, 166s method, 166s url, 166s > response.version_string, 166s response.status, 166s response.length_remaining, 166s ) 166s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 166s 166s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 166s ----------------------------- Captured stderr call ----------------------------- 166s 127.0.0.1 - - [18/Jan/2025 03:25:36] "GET / HTTP/1.1" 200 9358 166s ______________________________ test_headers[http] ______________________________ 166s 166s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-0/test_headers_http_0') 166s httpbin_both = 166s verify_pool_mgr = 166s 166s def test_headers(tmpdir, httpbin_both, verify_pool_mgr): 166s """Ensure that we can read the headers back""" 166s url = httpbin_both.url 166s with vcr.use_cassette(str(tmpdir.join("headers.yaml"))): 166s > headers = verify_pool_mgr.request("GET", url).headers 166s 166s tests/integration/test_urllib3.py:44: 166s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 166s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 166s return self.request_encode_url( 166s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 166s return self.urlopen(method, url, **extra_kw) 166s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 166s response = conn.urlopen(method, u.request_uri, **kw) 166s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 166s response = self._make_request( 166s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 166s 166s self = 166s conn = 166s method = 'GET', url = '/', body = None, headers = {} 166s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 166s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 166s chunked = False, response_conn = None, preload_content = True 166s decode_content = True, enforce_content_length = True 166s 166s def _make_request( 166s self, 166s conn: BaseHTTPConnection, 166s method: str, 166s url: str, 166s body: _TYPE_BODY | None = None, 166s headers: typing.Mapping[str, str] | None = None, 166s retries: Retry | None = None, 166s 
timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 166s chunked: bool = False, 166s response_conn: BaseHTTPConnection | None = None, 166s preload_content: bool = True, 166s decode_content: bool = True, 166s enforce_content_length: bool = True, 166s ) -> BaseHTTPResponse: 166s """ 166s Perform a request on a given urllib connection object taken from our 166s pool. 166s 166s :param conn: 166s a connection from one of our connection pools 166s 166s :param method: 166s HTTP request method (such as GET, POST, PUT, etc.) 166s 166s :param url: 166s The URL to perform the request on. 166s 166s :param body: 166s Data to send in the request body, either :class:`str`, :class:`bytes`, 166s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 166s 166s :param headers: 166s Dictionary of custom headers to send, such as User-Agent, 166s If-None-Match, etc. If None, pool headers are used. If provided, 166s these headers completely replace any pool-specific headers. 166s 166s :param retries: 166s Configure the number of retries to allow before raising a 166s :class:`~urllib3.exceptions.MaxRetryError` exception. 166s 166s Pass ``None`` to retry until you receive a response. Pass a 166s :class:`~urllib3.util.retry.Retry` object for fine-grained control 166s over different types of retries. 166s Pass an integer number to retry connection errors that many times, 166s but no other types of errors. Pass zero to never retry. 166s 166s If ``False``, then retries are disabled and any exception is raised 166s immediately. Also, instead of raising a MaxRetryError on redirects, 166s the redirect response will be returned. 166s 166s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 166s 166s :param timeout: 166s If specified, overrides the default timeout for this one 166s request. It may be a float (in seconds) or an instance of 166s :class:`urllib3.util.Timeout`. 166s 166s :param chunked: 166s If True, urllib3 will send the body using chunked transfer 166s encoding. Otherwise, urllib3 will send the body using the standard 166s content-length form. Defaults to False. 166s 166s :param response_conn: 166s Set this to ``None`` if you will handle releasing the connection or 166s set the connection to have the response release it. 166s 166s :param preload_content: 166s If True, the response's body will be preloaded during construction. 166s 166s :param decode_content: 166s If True, will attempt to decode the body based on the 166s 'content-encoding' header. 166s 166s :param enforce_content_length: 166s Enforce content length checking. Body returned by server must match 166s value of Content-Length header, if present. Otherwise, raise error. 166s """ 166s self.num_requests += 1 166s 166s timeout_obj = self._get_timeout(timeout) 166s timeout_obj.start_connect() 166s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 166s 166s try: 166s # Trigger any extra validation we need to do. 166s try: 166s self._validate_conn(conn) 166s except (SocketTimeout, BaseSSLError) as e: 166s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 166s raise 166s 166s # _validate_conn() starts the connection to an HTTPS proxy 166s # so we need to wrap errors with 'ProxyError' here too. 
166s except ( 166s OSError, 166s NewConnectionError, 166s TimeoutError, 166s BaseSSLError, 166s CertificateError, 166s SSLError, 166s ) as e: 166s new_e: Exception = e 166s if isinstance(e, (BaseSSLError, CertificateError)): 166s new_e = SSLError(e) 166s # If the connection didn't successfully connect to it's proxy 166s # then there 166s if isinstance( 166s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 166s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 166s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 166s raise new_e 166s 166s # conn.request() calls http.client.*.request, not the method in 166s # urllib3.request. It also calls makefile (recv) on the socket. 166s try: 166s conn.request( 166s method, 166s url, 166s body=body, 166s headers=headers, 166s chunked=chunked, 166s preload_content=preload_content, 166s decode_content=decode_content, 166s enforce_content_length=enforce_content_length, 166s ) 166s 166s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 166s # legitimately able to close the connection after sending a valid response. 166s # With this behaviour, the received response is still readable. 166s except BrokenPipeError: 166s pass 166s except OSError as e: 166s # MacOS/Linux 166s # EPROTOTYPE and ECONNRESET are needed on macOS 166s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 166s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 166s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 166s raise 166s 166s # Reset the timeout for the recv() on the socket 166s read_timeout = timeout_obj.read_timeout 166s 166s if not conn.is_closed: 166s # In Python 3 socket.py will catch EAGAIN and return None when you 166s # try and read into the file pointer created by http.client, which 166s # instead raises a BadStatusLine exception. Instead of catching 166s # the exception and assuming all BadStatusLine exceptions are read 166s # timeouts, check for a zero timeout before making the request. 166s if read_timeout == 0: 166s raise ReadTimeoutError( 166s self, url, f"Read timed out. (read timeout={read_timeout})" 166s ) 166s conn.timeout = read_timeout 166s 166s # Receive the response from the server 166s try: 166s response = conn.getresponse() 166s except (BaseSSLError, OSError) as e: 166s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 166s raise 166s 166s # Set properties that are used by the pooling layer. 
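The failure repeated throughout this run happens at the debug-logging call shown next: urllib3 2.3.0's connection pool reads `response.version_string` from whatever object `conn.getresponse()` returned, and the `VCRHTTPResponse` stub that vcr.py substitutes does not define that attribute. A minimal sketch of the shape urllib3 expects, assuming an http.client-style integer `version` field; the class name is hypothetical and is not vcr.py's API:

    # Sketch only: a stand-in for the response object that urllib3 2.3.0 logs
    # after a request completes. Names here are illustrative, not vcr.py's.
    class RecordedResponse:
        def __init__(self, status: int, version: int = 11):
            self.status = status
            # http.client exposes the protocol as the integer 10 or 11.
            self.version = version

        @property
        def version_string(self) -> str:
            # The attribute read by connectionpool.py's log.debug() call below.
            return "HTTP/1.1" if self.version == 11 else "HTTP/1.0"

    resp = RecordedResponse(status=200)
    assert resp.version_string == "HTTP/1.1"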
166s response.retries = retries 166s response._connection = response_conn # type: ignore[attr-defined] 166s response._pool = self # type: ignore[attr-defined] 166s 166s log.debug( 166s '%s://%s:%s "%s %s %s" %s %s', 166s self.scheme, 166s self.host, 166s self.port, 166s method, 166s url, 166s > response.version_string, 166s response.status, 166s response.length_remaining, 166s ) 166s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 166s 166s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 166s _______________________________ test_body[http] ________________________________ 166s 166s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-0/test_body_http_0') 166s httpbin_both = 166s verify_pool_mgr = 166s 166s def test_body(tmpdir, httpbin_both, verify_pool_mgr): 166s """Ensure the responses are all identical enough""" 166s url = httpbin_both.url + "/bytes/1024" 166s with vcr.use_cassette(str(tmpdir.join("body.yaml"))): 166s > content = verify_pool_mgr.request("GET", url).data 166s 166s tests/integration/test_urllib3.py:55: 166s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 166s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 166s return self.request_encode_url( 166s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 166s return self.urlopen(method, url, **extra_kw) 166s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 166s response = conn.urlopen(method, u.request_uri, **kw) 166s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 166s response = self._make_request( 166s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 166s 166s self = 166s conn = 166s method = 'GET', url = '/bytes/1024', body = None, headers = {} 166s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 166s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 166s chunked = False, response_conn = None, preload_content = True 166s decode_content = True, enforce_content_length = True 166s 166s def _make_request( 166s self, 166s conn: BaseHTTPConnection, 166s method: str, 166s url: str, 166s body: _TYPE_BODY | None = None, 166s headers: typing.Mapping[str, str] | None = None, 166s retries: Retry | None = None, 166s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 166s chunked: bool = False, 166s response_conn: BaseHTTPConnection | None = None, 166s preload_content: bool = True, 166s decode_content: bool = True, 166s enforce_content_length: bool = True, 166s ) -> BaseHTTPResponse: 166s """ 166s Perform a request on a given urllib connection object taken from our 166s pool. 166s 166s :param conn: 166s a connection from one of our connection pools 166s 166s :param method: 166s HTTP request method (such as GET, POST, PUT, etc.) 166s 166s :param url: 166s The URL to perform the request on. 166s 166s :param body: 166s Data to send in the request body, either :class:`str`, :class:`bytes`, 166s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 166s 166s :param headers: 166s Dictionary of custom headers to send, such as User-Agent, 166s If-None-Match, etc. If None, pool headers are used. If provided, 166s these headers completely replace any pool-specific headers. 166s 166s :param retries: 166s Configure the number of retries to allow before raising a 166s :class:`~urllib3.exceptions.MaxRetryError` exception. 
166s 166s Pass ``None`` to retry until you receive a response. Pass a 166s :class:`~urllib3.util.retry.Retry` object for fine-grained control 166s over different types of retries. 166s Pass an integer number to retry connection errors that many times, 166s but no other types of errors. Pass zero to never retry. 166s 166s If ``False``, then retries are disabled and any exception is raised 166s immediately. Also, instead of raising a MaxRetryError on redirects, 166s the redirect response will be returned. 166s 166s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 166s 166s :param timeout: 166s If specified, overrides the default timeout for this one 166s request. It may be a float (in seconds) or an instance of 166s :class:`urllib3.util.Timeout`. 166s 166s :param chunked: 166s If True, urllib3 will send the body using chunked transfer 166s encoding. Otherwise, urllib3 will send the body using the standard 166s content-length form. Defaults to False. 166s 166s :param response_conn: 166s Set this to ``None`` if you will handle releasing the connection or 166s set the connection to have the response release it. 166s 166s :param preload_content: 166s If True, the response's body will be preloaded during construction. 166s 166s :param decode_content: 166s If True, will attempt to decode the body based on the 166s 'content-encoding' header. 166s 166s :param enforce_content_length: 166s Enforce content length checking. Body returned by server must match 166s value of Content-Length header, if present. Otherwise, raise error. 166s """ 166s self.num_requests += 1 166s 166s timeout_obj = self._get_timeout(timeout) 166s timeout_obj.start_connect() 166s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 166s 166s try: 166s # Trigger any extra validation we need to do. 166s try: 166s self._validate_conn(conn) 166s except (SocketTimeout, BaseSSLError) as e: 166s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 166s raise 166s 166s # _validate_conn() starts the connection to an HTTPS proxy 166s # so we need to wrap errors with 'ProxyError' here too. 166s except ( 166s OSError, 166s NewConnectionError, 166s TimeoutError, 166s BaseSSLError, 166s CertificateError, 166s SSLError, 166s ) as e: 166s new_e: Exception = e 166s if isinstance(e, (BaseSSLError, CertificateError)): 166s new_e = SSLError(e) 166s # If the connection didn't successfully connect to it's proxy 166s # then there 166s if isinstance( 166s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 166s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 166s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 166s raise new_e 166s 166s # conn.request() calls http.client.*.request, not the method in 166s # urllib3.request. It also calls makefile (recv) on the socket. 166s try: 166s conn.request( 166s method, 166s url, 166s body=body, 166s headers=headers, 166s chunked=chunked, 166s preload_content=preload_content, 166s decode_content=decode_content, 166s enforce_content_length=enforce_content_length, 166s ) 166s 166s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 166s # legitimately able to close the connection after sending a valid response. 166s # With this behaviour, the received response is still readable. 
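The retries contract quoted in the docstring above accepts several spellings; a short sketch using only public urllib3 APIs, with the request lines commented because the URL is a placeholder for the local test server:

    import urllib3
    from urllib3.util.retry import Retry

    pool = urllib3.PoolManager()
    # A Retry object gives fine-grained control; a bare int only retries
    # connection errors; retries=False disables retrying and returns redirect
    # responses instead of raising MaxRetryError.
    retry = Retry(total=3, connect=2)
    # pool.request("GET", "http://127.0.0.1:8080/get", retries=retry)
    # pool.request("GET", "http://127.0.0.1:8080/status/302", retries=False)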
166s except BrokenPipeError: 166s pass 166s except OSError as e: 166s # MacOS/Linux 166s # EPROTOTYPE and ECONNRESET are needed on macOS 166s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 166s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 166s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 166s raise 166s 166s # Reset the timeout for the recv() on the socket 166s read_timeout = timeout_obj.read_timeout 166s 166s if not conn.is_closed: 166s # In Python 3 socket.py will catch EAGAIN and return None when you 166s # try and read into the file pointer created by http.client, which 166s # instead raises a BadStatusLine exception. Instead of catching 166s # the exception and assuming all BadStatusLine exceptions are read 166s # timeouts, check for a zero timeout before making the request. 166s if read_timeout == 0: 166s raise ReadTimeoutError( 166s self, url, f"Read timed out. (read timeout={read_timeout})" 166s ) 166s conn.timeout = read_timeout 166s 166s # Receive the response from the server 166s try: 166s response = conn.getresponse() 166s except (BaseSSLError, OSError) as e: 166s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 166s raise 166s 166s # Set properties that are used by the pooling layer. 166s response.retries = retries 166s response._connection = response_conn # type: ignore[attr-defined] 166s response._pool = self # type: ignore[attr-defined] 166s 166s log.debug( 166s '%s://%s:%s "%s %s %s" %s %s', 166s self.scheme, 166s self.host, 166s self.port, 166s method, 166s url, 166s > response.version_string, 166s response.status, 166s response.length_remaining, 166s ) 166s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 166s 166s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 166s ----------------------------- Captured stderr call ----------------------------- 166s 127.0.0.1 - - [18/Jan/2025 03:25:36] "GET /bytes/1024 HTTP/1.1" 200 1024 166s _______________________________ test_auth[http] ________________________________ 166s 166s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-0/test_auth_http_0') 166s httpbin_both = 166s verify_pool_mgr = 166s 166s def test_auth(tmpdir, httpbin_both, verify_pool_mgr): 166s """Ensure that we can handle basic auth""" 166s auth = ("user", "passwd") 166s headers = urllib3.util.make_headers(basic_auth="{}:{}".format(*auth)) 166s url = httpbin_both.url + "/basic-auth/user/passwd" 166s with vcr.use_cassette(str(tmpdir.join("auth.yaml"))): 166s > one = verify_pool_mgr.request("GET", url, headers=headers) 166s 166s tests/integration/test_urllib3.py:67: 166s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 166s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 166s return self.request_encode_url( 166s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 166s return self.urlopen(method, url, **extra_kw) 166s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 166s response = conn.urlopen(method, u.request_uri, **kw) 166s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 166s response = self._make_request( 166s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 166s 166s self = 166s conn = 166s method = 'GET', url = '/basic-auth/user/passwd', body = None 166s headers = {'authorization': 'Basic dXNlcjpwYXNzd2Q='} 166s retries = Retry(total=3, 
connect=None, read=None, redirect=None, status=None) 166s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 166s chunked = False, response_conn = None, preload_content = True 166s decode_content = True, enforce_content_length = True 166s 166s def _make_request( 166s self, 166s conn: BaseHTTPConnection, 166s method: str, 166s url: str, 166s body: _TYPE_BODY | None = None, 166s headers: typing.Mapping[str, str] | None = None, 166s retries: Retry | None = None, 166s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 166s chunked: bool = False, 166s response_conn: BaseHTTPConnection | None = None, 166s preload_content: bool = True, 166s decode_content: bool = True, 166s enforce_content_length: bool = True, 166s ) -> BaseHTTPResponse: 166s """ 166s Perform a request on a given urllib connection object taken from our 166s pool. 166s 166s :param conn: 166s a connection from one of our connection pools 166s 166s :param method: 166s HTTP request method (such as GET, POST, PUT, etc.) 166s 166s :param url: 166s The URL to perform the request on. 166s 166s :param body: 166s Data to send in the request body, either :class:`str`, :class:`bytes`, 166s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 166s 166s :param headers: 166s Dictionary of custom headers to send, such as User-Agent, 166s If-None-Match, etc. If None, pool headers are used. If provided, 166s these headers completely replace any pool-specific headers. 166s 166s :param retries: 166s Configure the number of retries to allow before raising a 166s :class:`~urllib3.exceptions.MaxRetryError` exception. 166s 166s Pass ``None`` to retry until you receive a response. Pass a 166s :class:`~urllib3.util.retry.Retry` object for fine-grained control 166s over different types of retries. 166s Pass an integer number to retry connection errors that many times, 166s but no other types of errors. Pass zero to never retry. 166s 166s If ``False``, then retries are disabled and any exception is raised 166s immediately. Also, instead of raising a MaxRetryError on redirects, 166s the redirect response will be returned. 166s 166s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 166s 166s :param timeout: 166s If specified, overrides the default timeout for this one 166s request. It may be a float (in seconds) or an instance of 166s :class:`urllib3.util.Timeout`. 166s 166s :param chunked: 166s If True, urllib3 will send the body using chunked transfer 166s encoding. Otherwise, urllib3 will send the body using the standard 166s content-length form. Defaults to False. 166s 166s :param response_conn: 166s Set this to ``None`` if you will handle releasing the connection or 166s set the connection to have the response release it. 166s 166s :param preload_content: 166s If True, the response's body will be preloaded during construction. 166s 166s :param decode_content: 166s If True, will attempt to decode the body based on the 166s 'content-encoding' header. 166s 166s :param enforce_content_length: 166s Enforce content length checking. Body returned by server must match 166s value of Content-Length header, if present. Otherwise, raise error. 166s """ 166s self.num_requests += 1 166s 166s timeout_obj = self._get_timeout(timeout) 166s timeout_obj.start_connect() 166s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 166s 166s try: 166s # Trigger any extra validation we need to do. 
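The authorization header captured in the locals above comes from urllib3's own helper, as the test source shows; this reproduces it outside the test (illustrative only):

    import urllib3

    headers = urllib3.util.make_headers(basic_auth="user:passwd")
    assert headers == {"authorization": "Basic dXNlcjpwYXNzd2Q="}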
166s try: 166s self._validate_conn(conn) 166s except (SocketTimeout, BaseSSLError) as e: 166s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 166s raise 166s 166s # _validate_conn() starts the connection to an HTTPS proxy 166s # so we need to wrap errors with 'ProxyError' here too. 166s except ( 166s OSError, 166s NewConnectionError, 166s TimeoutError, 166s BaseSSLError, 166s CertificateError, 166s SSLError, 166s ) as e: 166s new_e: Exception = e 166s if isinstance(e, (BaseSSLError, CertificateError)): 166s new_e = SSLError(e) 166s # If the connection didn't successfully connect to it's proxy 166s # then there 166s if isinstance( 166s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 166s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 166s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 166s raise new_e 166s 166s # conn.request() calls http.client.*.request, not the method in 166s # urllib3.request. It also calls makefile (recv) on the socket. 166s try: 166s conn.request( 166s method, 166s url, 166s body=body, 166s headers=headers, 166s chunked=chunked, 166s preload_content=preload_content, 166s decode_content=decode_content, 166s enforce_content_length=enforce_content_length, 166s ) 166s 166s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 166s # legitimately able to close the connection after sending a valid response. 166s # With this behaviour, the received response is still readable. 166s except BrokenPipeError: 166s pass 166s except OSError as e: 166s # MacOS/Linux 166s # EPROTOTYPE and ECONNRESET are needed on macOS 166s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 166s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 166s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 166s raise 166s 166s # Reset the timeout for the recv() on the socket 166s read_timeout = timeout_obj.read_timeout 166s 166s if not conn.is_closed: 166s # In Python 3 socket.py will catch EAGAIN and return None when you 166s # try and read into the file pointer created by http.client, which 166s # instead raises a BadStatusLine exception. Instead of catching 166s # the exception and assuming all BadStatusLine exceptions are read 166s # timeouts, check for a zero timeout before making the request. 166s if read_timeout == 0: 166s raise ReadTimeoutError( 166s self, url, f"Read timed out. (read timeout={read_timeout})" 166s ) 166s conn.timeout = read_timeout 166s 166s # Receive the response from the server 166s try: 166s response = conn.getresponse() 166s except (BaseSSLError, OSError) as e: 166s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 166s raise 166s 166s # Set properties that are used by the pooling layer. 
166s response.retries = retries 166s response._connection = response_conn # type: ignore[attr-defined] 166s response._pool = self # type: ignore[attr-defined] 166s 166s log.debug( 166s '%s://%s:%s "%s %s %s" %s %s', 166s self.scheme, 166s self.host, 166s self.port, 166s method, 166s url, 166s > response.version_string, 166s response.status, 166s response.length_remaining, 166s ) 166s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 166s 166s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 166s ----------------------------- Captured stderr call ----------------------------- 166s 127.0.0.1 - - [18/Jan/2025 03:25:36] "GET /basic-auth/user/passwd HTTP/1.1" 200 46 166s ____________________________ test_auth_failed[http] ____________________________ 166s 166s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-0/test_auth_failed_http_0') 166s httpbin_both = 166s verify_pool_mgr = 166s 166s def test_auth_failed(tmpdir, httpbin_both, verify_pool_mgr): 166s """Ensure that we can save failed auth statuses""" 166s auth = ("user", "wrongwrongwrong") 166s headers = urllib3.util.make_headers(basic_auth="{}:{}".format(*auth)) 166s url = httpbin_both.url + "/basic-auth/user/passwd" 166s with vcr.use_cassette(str(tmpdir.join("auth-failed.yaml"))) as cass: 166s # Ensure that this is empty to begin with 166s assert_cassette_empty(cass) 166s > one = verify_pool_mgr.request("GET", url, headers=headers) 166s 166s tests/integration/test_urllib3.py:83: 166s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 166s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 166s return self.request_encode_url( 166s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 166s return self.urlopen(method, url, **extra_kw) 166s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 166s response = conn.urlopen(method, u.request_uri, **kw) 166s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 166s response = self._make_request( 166s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 166s 166s self = 166s conn = 166s method = 'GET', url = '/basic-auth/user/passwd', body = None 166s headers = {'authorization': 'Basic dXNlcjp3cm9uZ3dyb25nd3Jvbmc='} 166s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 166s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 166s chunked = False, response_conn = None, preload_content = True 166s decode_content = True, enforce_content_length = True 166s 166s def _make_request( 166s self, 166s conn: BaseHTTPConnection, 166s method: str, 166s url: str, 166s body: _TYPE_BODY | None = None, 166s headers: typing.Mapping[str, str] | None = None, 166s retries: Retry | None = None, 166s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 166s chunked: bool = False, 166s response_conn: BaseHTTPConnection | None = None, 166s preload_content: bool = True, 166s decode_content: bool = True, 166s enforce_content_length: bool = True, 166s ) -> BaseHTTPResponse: 166s """ 166s Perform a request on a given urllib connection object taken from our 166s pool. 166s 166s :param conn: 166s a connection from one of our connection pools 166s 166s :param method: 166s HTTP request method (such as GET, POST, PUT, etc.) 166s 166s :param url: 166s The URL to perform the request on. 
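Each failing test wraps its request in `vcr.use_cassette(...)`, which records the first live HTTP exchange to a YAML cassette and replays it afterwards. A minimal sketch of that usage outside pytest; the cassette path and URL are placeholders and assume a reachable endpoint such as the local httpbin these tests start:

    import urllib3
    import vcr

    pool = urllib3.PoolManager()
    # First run records the exchange into the cassette; later runs replay it
    # from disk instead of hitting the network.
    with vcr.use_cassette("/tmp/example-cassette.yaml"):
        resp = pool.request("GET", "http://127.0.0.1:8080/get")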
166s 166s :param body: 166s Data to send in the request body, either :class:`str`, :class:`bytes`, 166s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 166s 166s :param headers: 166s Dictionary of custom headers to send, such as User-Agent, 166s If-None-Match, etc. If None, pool headers are used. If provided, 166s these headers completely replace any pool-specific headers. 166s 166s :param retries: 166s Configure the number of retries to allow before raising a 166s :class:`~urllib3.exceptions.MaxRetryError` exception. 166s 166s Pass ``None`` to retry until you receive a response. Pass a 166s :class:`~urllib3.util.retry.Retry` object for fine-grained control 166s over different types of retries. 166s Pass an integer number to retry connection errors that many times, 166s but no other types of errors. Pass zero to never retry. 166s 166s If ``False``, then retries are disabled and any exception is raised 166s immediately. Also, instead of raising a MaxRetryError on redirects, 166s the redirect response will be returned. 166s 166s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 166s 166s :param timeout: 166s If specified, overrides the default timeout for this one 166s request. It may be a float (in seconds) or an instance of 166s :class:`urllib3.util.Timeout`. 166s 166s :param chunked: 166s If True, urllib3 will send the body using chunked transfer 166s encoding. Otherwise, urllib3 will send the body using the standard 166s content-length form. Defaults to False. 166s 166s :param response_conn: 166s Set this to ``None`` if you will handle releasing the connection or 166s set the connection to have the response release it. 166s 166s :param preload_content: 166s If True, the response's body will be preloaded during construction. 166s 166s :param decode_content: 166s If True, will attempt to decode the body based on the 166s 'content-encoding' header. 166s 166s :param enforce_content_length: 166s Enforce content length checking. Body returned by server must match 166s value of Content-Length header, if present. Otherwise, raise error. 166s """ 166s self.num_requests += 1 166s 166s timeout_obj = self._get_timeout(timeout) 166s timeout_obj.start_connect() 166s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 166s 166s try: 166s # Trigger any extra validation we need to do. 166s try: 166s self._validate_conn(conn) 166s except (SocketTimeout, BaseSSLError) as e: 166s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 166s raise 166s 166s # _validate_conn() starts the connection to an HTTPS proxy 166s # so we need to wrap errors with 'ProxyError' here too. 166s except ( 166s OSError, 166s NewConnectionError, 166s TimeoutError, 166s BaseSSLError, 166s CertificateError, 166s SSLError, 166s ) as e: 166s new_e: Exception = e 166s if isinstance(e, (BaseSSLError, CertificateError)): 166s new_e = SSLError(e) 166s # If the connection didn't successfully connect to it's proxy 166s # then there 166s if isinstance( 166s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 166s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 166s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 166s raise new_e 166s 166s # conn.request() calls http.client.*.request, not the method in 166s # urllib3.request. It also calls makefile (recv) on the socket. 
166s try: 166s conn.request( 166s method, 166s url, 166s body=body, 166s headers=headers, 166s chunked=chunked, 166s preload_content=preload_content, 166s decode_content=decode_content, 166s enforce_content_length=enforce_content_length, 166s ) 166s 166s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 166s # legitimately able to close the connection after sending a valid response. 166s # With this behaviour, the received response is still readable. 166s except BrokenPipeError: 166s pass 166s except OSError as e: 166s # MacOS/Linux 166s # EPROTOTYPE and ECONNRESET are needed on macOS 166s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 166s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 166s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 166s raise 166s 166s # Reset the timeout for the recv() on the socket 166s read_timeout = timeout_obj.read_timeout 166s 166s if not conn.is_closed: 166s # In Python 3 socket.py will catch EAGAIN and return None when you 166s # try and read into the file pointer created by http.client, which 166s # instead raises a BadStatusLine exception. Instead of catching 166s # the exception and assuming all BadStatusLine exceptions are read 166s # timeouts, check for a zero timeout before making the request. 166s if read_timeout == 0: 166s raise ReadTimeoutError( 166s self, url, f"Read timed out. (read timeout={read_timeout})" 166s ) 166s conn.timeout = read_timeout 166s 166s # Receive the response from the server 166s try: 166s response = conn.getresponse() 166s except (BaseSSLError, OSError) as e: 166s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 166s raise 166s 166s # Set properties that are used by the pooling layer. 
166s response.retries = retries 166s response._connection = response_conn # type: ignore[attr-defined] 166s response._pool = self # type: ignore[attr-defined] 166s 166s log.debug( 166s '%s://%s:%s "%s %s %s" %s %s', 166s self.scheme, 166s self.host, 166s self.port, 166s method, 166s url, 166s > response.version_string, 166s response.status, 166s response.length_remaining, 166s ) 166s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 166s 166s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 166s ----------------------------- Captured stderr call ----------------------------- 166s 127.0.0.1 - - [18/Jan/2025 03:25:37] "GET /basic-auth/user/passwd HTTP/1.1" 401 0 166s _______________________________ test_post[http] ________________________________ 166s 166s self = 166s method = 'POST', url = '/post', body = {'key1': 'value1', 'key2': 'value2'} 166s headers = HTTPHeaderDict({}) 166s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 166s redirect = False, assert_same_host = False, timeout = <_TYPE_DEFAULT.token: -1> 166s pool_timeout = None, release_conn = True, chunked = False, body_pos = None 166s preload_content = True, decode_content = True, response_kw = {} 166s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/post', query=None, fragment=None) 166s destination_scheme = None, conn = None, release_this_conn = True 166s http_tunnel_required = False, err = None, clean_exit = False 166s 166s def urlopen( # type: ignore[override] 166s self, 166s method: str, 166s url: str, 166s body: _TYPE_BODY | None = None, 166s headers: typing.Mapping[str, str] | None = None, 166s retries: Retry | bool | int | None = None, 166s redirect: bool = True, 166s assert_same_host: bool = True, 166s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 166s pool_timeout: int | None = None, 166s release_conn: bool | None = None, 166s chunked: bool = False, 166s body_pos: _TYPE_BODY_POSITION | None = None, 166s preload_content: bool = True, 166s decode_content: bool = True, 166s **response_kw: typing.Any, 166s ) -> BaseHTTPResponse: 166s """ 166s Get a connection from the pool and perform an HTTP request. This is the 166s lowest level call for making a request, so you'll need to specify all 166s the raw details. 166s 166s .. note:: 166s 166s More commonly, it's appropriate to use a convenience method 166s such as :meth:`request`. 166s 166s .. note:: 166s 166s `release_conn` will only behave as expected if 166s `preload_content=False` because we want to make 166s `preload_content=False` the default behaviour someday soon without 166s breaking backwards compatibility. 166s 166s :param method: 166s HTTP request method (such as GET, POST, PUT, etc.) 166s 166s :param url: 166s The URL to perform the request on. 166s 166s :param body: 166s Data to send in the request body, either :class:`str`, :class:`bytes`, 166s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 166s 166s :param headers: 166s Dictionary of custom headers to send, such as User-Agent, 166s If-None-Match, etc. If None, pool headers are used. If provided, 166s these headers completely replace any pool-specific headers. 166s 166s :param retries: 166s Configure the number of retries to allow before raising a 166s :class:`~urllib3.exceptions.MaxRetryError` exception. 166s 166s If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a 166s :class:`~urllib3.util.retry.Retry` object for fine-grained control 166s over different types of retries. 
166s Pass an integer number to retry connection errors that many times, 166s but no other types of errors. Pass zero to never retry. 166s 166s If ``False``, then retries are disabled and any exception is raised 166s immediately. Also, instead of raising a MaxRetryError on redirects, 166s the redirect response will be returned. 166s 166s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 166s 166s :param redirect: 166s If True, automatically handle redirects (status codes 301, 302, 166s 303, 307, 308). Each redirect counts as a retry. Disabling retries 166s will disable redirect, too. 166s 166s :param assert_same_host: 166s If ``True``, will make sure that the host of the pool requests is 166s consistent else will raise HostChangedError. When ``False``, you can 166s use the pool on an HTTP proxy and request foreign hosts. 166s 166s :param timeout: 166s If specified, overrides the default timeout for this one 166s request. It may be a float (in seconds) or an instance of 166s :class:`urllib3.util.Timeout`. 166s 166s :param pool_timeout: 166s If set and the pool is set to block=True, then this method will 166s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 166s connection is available within the time period. 166s 166s :param bool preload_content: 166s If True, the response's body will be preloaded into memory. 166s 166s :param bool decode_content: 166s If True, will attempt to decode the body based on the 166s 'content-encoding' header. 166s 166s :param release_conn: 166s If False, then the urlopen call will not release the connection 166s back into the pool once a response is received (but will release if 166s you read the entire contents of the response such as when 166s `preload_content=True`). This is useful if you're not preloading 166s the response's content immediately. You will need to call 166s ``r.release_conn()`` on the response ``r`` to return the connection 166s back into the pool. If None, it takes the value of ``preload_content`` 166s which defaults to ``True``. 166s 166s :param bool chunked: 166s If True, urllib3 will send the body using chunked transfer 166s encoding. Otherwise, urllib3 will send the body using the standard 166s content-length form. Defaults to False. 166s 166s :param int body_pos: 166s Position to seek to in file-like body in the event of a retry or 166s redirect. Typically this won't need to be set because urllib3 will 166s auto-populate the value when needed. 166s """ 166s parsed_url = parse_url(url) 166s destination_scheme = parsed_url.scheme 166s 166s if headers is None: 166s headers = self.headers 166s 166s if not isinstance(retries, Retry): 166s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 166s 166s if release_conn is None: 166s release_conn = preload_content 166s 166s # Check host 166s if assert_same_host and not self.is_same_host(url): 166s raise HostChangedError(self, url, retries) 166s 166s # Ensure that the URL we're connecting to is properly encoded 166s if url.startswith("/"): 166s url = to_str(_encode_target(url)) 166s else: 166s url = to_str(parsed_url.url) 166s 166s conn = None 166s 166s # Track whether `conn` needs to be released before 166s # returning/raising/recursing. Update this variable if necessary, and 166s # leave `release_conn` constant throughout the function. That way, if 166s # the function recurses, the original value of `release_conn` will be 166s # passed down into the recursive call, and its value will be respected. 
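The `release_conn`/`preload_content` interplay described in the urlopen docstring above is easiest to see with streaming reads; a sketch, with the network calls left commented because they assume the local test server:

    import urllib3

    pool = urllib3.PoolManager()
    # resp = pool.request("GET", "http://127.0.0.1:8080/bytes/1024",
    #                     preload_content=False)
    # for chunk in resp.stream(256):   # read incrementally, not preloaded
    #     pass
    # resp.release_conn()              # return the socket to the pool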
166s # 166s # See issue #651 [1] for details. 166s # 166s # [1] 166s release_this_conn = release_conn 166s 166s http_tunnel_required = connection_requires_http_tunnel( 166s self.proxy, self.proxy_config, destination_scheme 166s ) 166s 166s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 166s # have to copy the headers dict so we can safely change it without those 166s # changes being reflected in anyone else's copy. 166s if not http_tunnel_required: 166s headers = headers.copy() # type: ignore[attr-defined] 166s headers.update(self.proxy_headers) # type: ignore[union-attr] 166s 166s # Must keep the exception bound to a separate variable or else Python 3 166s # complains about UnboundLocalError. 166s err = None 166s 166s # Keep track of whether we cleanly exited the except block. This 166s # ensures we do proper cleanup in finally. 166s clean_exit = False 166s 166s # Rewind body position, if needed. Record current position 166s # for future rewinds in the event of a redirect/retry. 166s body_pos = set_file_position(body, body_pos) 166s 166s try: 166s # Request a connection from the queue. 166s timeout_obj = self._get_timeout(timeout) 166s conn = self._get_conn(timeout=pool_timeout) 166s 166s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 166s 166s # Is this a closed/new connection that requires CONNECT tunnelling? 166s if self.proxy is not None and http_tunnel_required and conn.is_closed: 166s try: 166s self._prepare_proxy(conn) 166s except (BaseSSLError, OSError, SocketTimeout) as e: 166s self._raise_timeout( 166s err=e, url=self.proxy.url, timeout_value=conn.timeout 166s ) 166s raise 166s 166s # If we're going to release the connection in ``finally:``, then 166s # the response doesn't need to know about the connection. Otherwise 166s # it will also try to release it and we'll have a double-release 166s # mess. 166s response_conn = conn if not release_conn else None 166s 166s # Make the request on the HTTPConnection object 166s > response = self._make_request( 166s conn, 166s method, 166s url, 166s timeout=timeout_obj, 166s body=body, 166s headers=headers, 166s chunked=chunked, 166s retries=retries, 166s response_conn=response_conn, 166s preload_content=preload_content, 166s decode_content=decode_content, 166s **response_kw, 166s ) 166s 166s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: 166s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 166s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:534: in _make_request 166s response = conn.getresponse() 166s /usr/lib/python3/dist-packages/vcr/stubs/__init__.py:277: in getresponse 166s self.real_connection.request( 166s /usr/lib/python3/dist-packages/urllib3/connection.py:457: in request 166s self.send(b"%x\r\n%b\r\n" % (len(chunk), chunk)) 166s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 166s 166s self = 166s data = b'4\r\nkey2\r\n' 166s 166s def send(self, data): 166s """Send `data' to the server. 166s ``data`` can be a string object, a bytes object, an array object, a 166s file-like object that supports a .read() method, or an iterable object. 
166s """ 166s 166s if self.sock is None: 166s if self.auto_open: 166s self.connect() 166s else: 166s raise NotConnected() 166s 166s if self.debuglevel > 0: 166s print("send:", repr(data)) 166s if hasattr(data, "read") : 166s if self.debuglevel > 0: 166s print("sending a readable") 166s encode = self._is_textIO(data) 166s if encode and self.debuglevel > 0: 166s print("encoding file using iso-8859-1") 166s while datablock := data.read(self.blocksize): 166s if encode: 166s datablock = datablock.encode("iso-8859-1") 166s sys.audit("http.client.send", self, datablock) 166s self.sock.sendall(datablock) 166s return 166s sys.audit("http.client.send", self, data) 166s try: 166s > self.sock.sendall(data) 166s E BrokenPipeError: [Errno 32] Broken pipe 166s 166s /usr/lib/python3.13/http/client.py:1055: BrokenPipeError 166s 166s During handling of the above exception, another exception occurred: 166s 166s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-0/test_post_http_0') 166s httpbin_both = 166s verify_pool_mgr = 166s 166s def test_post(tmpdir, httpbin_both, verify_pool_mgr): 166s """Ensure that we can post and cache the results""" 166s data = {"key1": "value1", "key2": "value2"} 166s url = httpbin_both.url + "/post" 166s with vcr.use_cassette(str(tmpdir.join("verify_pool_mgr.yaml"))): 166s > req1 = verify_pool_mgr.request("POST", url, data).data 166s 166s tests/integration/test_urllib3.py:94: 166s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 166s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:143: in request 166s return self.request_encode_body( 166s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:278: in request_encode_body 166s return self.urlopen(method, url, **extra_kw) 166s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 166s response = conn.urlopen(method, u.request_uri, **kw) 166s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:841: in urlopen 166s retries = retries.increment( 166s /usr/lib/python3/dist-packages/urllib3/util/retry.py:474: in increment 166s raise reraise(type(error), error, _stacktrace) 166s /usr/lib/python3/dist-packages/urllib3/util/util.py:38: in reraise 166s raise value.with_traceback(tb) 166s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 166s response = self._make_request( 166s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:534: in _make_request 166s response = conn.getresponse() 166s /usr/lib/python3/dist-packages/vcr/stubs/__init__.py:277: in getresponse 166s self.real_connection.request( 166s /usr/lib/python3/dist-packages/urllib3/connection.py:457: in request 166s self.send(b"%x\r\n%b\r\n" % (len(chunk), chunk)) 166s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 166s 166s self = 166s data = b'4\r\nkey2\r\n' 166s 166s def send(self, data): 166s """Send `data' to the server. 166s ``data`` can be a string object, a bytes object, an array object, a 166s file-like object that supports a .read() method, or an iterable object. 
166s """ 166s 166s if self.sock is None: 166s if self.auto_open: 166s self.connect() 166s else: 166s raise NotConnected() 166s 166s if self.debuglevel > 0: 166s print("send:", repr(data)) 166s if hasattr(data, "read") : 166s if self.debuglevel > 0: 166s print("sending a readable") 166s encode = self._is_textIO(data) 166s if encode and self.debuglevel > 0: 166s print("encoding file using iso-8859-1") 166s while datablock := data.read(self.blocksize): 166s if encode: 166s datablock = datablock.encode("iso-8859-1") 166s sys.audit("http.client.send", self, datablock) 166s self.sock.sendall(datablock) 166s return 166s sys.audit("http.client.send", self, data) 166s try: 166s > self.sock.sendall(data) 166s E urllib3.exceptions.ProtocolError: ('Connection aborted.', BrokenPipeError(32, 'Broken pipe')) 166s 166s /usr/lib/python3.13/http/client.py:1055: ProtocolError 166s ----------------------------- Captured stderr call ----------------------------- 166s 127.0.0.1 - - [18/Jan/2025 03:25:37] "POST /post HTTP/1.1" 501 159 166s _______________________________ test_gzip[http] ________________________________ 166s 166s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-0/test_gzip_http_0') 166s httpbin_both = 166s verify_pool_mgr = 166s 166s def test_gzip(tmpdir, httpbin_both, verify_pool_mgr): 166s """ 166s Ensure that requests (actually urllib3) is able to automatically decompress 166s the response body 166s """ 166s url = httpbin_both.url + "/gzip" 166s response = verify_pool_mgr.request("GET", url) 166s 166s with vcr.use_cassette(str(tmpdir.join("gzip.yaml"))): 166s > response = verify_pool_mgr.request("GET", url) 166s 166s tests/integration/test_urllib3.py:140: 166s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 166s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 166s return self.request_encode_url( 166s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 166s return self.urlopen(method, url, **extra_kw) 166s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 166s response = conn.urlopen(method, u.request_uri, **kw) 166s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 166s response = self._make_request( 166s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 166s 166s self = 166s conn = 166s method = 'GET', url = '/gzip', body = None, headers = {} 166s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 166s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 166s chunked = False, response_conn = None, preload_content = True 166s decode_content = True, enforce_content_length = True 166s 166s def _make_request( 166s self, 166s conn: BaseHTTPConnection, 166s method: str, 166s url: str, 166s body: _TYPE_BODY | None = None, 166s headers: typing.Mapping[str, str] | None = None, 166s retries: Retry | None = None, 166s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 166s chunked: bool = False, 166s response_conn: BaseHTTPConnection | None = None, 166s preload_content: bool = True, 166s decode_content: bool = True, 166s enforce_content_length: bool = True, 166s ) -> BaseHTTPResponse: 166s """ 166s Perform a request on a given urllib connection object taken from our 166s pool. 166s 166s :param conn: 166s a connection from one of our connection pools 166s 166s :param method: 166s HTTP request method (such as GET, POST, PUT, etc.) 
166s 166s :param url: 166s The URL to perform the request on. 166s 166s :param body: 166s Data to send in the request body, either :class:`str`, :class:`bytes`, 166s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 166s 166s :param headers: 166s Dictionary of custom headers to send, such as User-Agent, 166s If-None-Match, etc. If None, pool headers are used. If provided, 166s these headers completely replace any pool-specific headers. 166s 166s :param retries: 166s Configure the number of retries to allow before raising a 166s :class:`~urllib3.exceptions.MaxRetryError` exception. 166s 166s Pass ``None`` to retry until you receive a response. Pass a 166s :class:`~urllib3.util.retry.Retry` object for fine-grained control 166s over different types of retries. 166s Pass an integer number to retry connection errors that many times, 166s but no other types of errors. Pass zero to never retry. 166s 166s If ``False``, then retries are disabled and any exception is raised 166s immediately. Also, instead of raising a MaxRetryError on redirects, 166s the redirect response will be returned. 166s 166s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 166s 166s :param timeout: 166s If specified, overrides the default timeout for this one 166s request. It may be a float (in seconds) or an instance of 166s :class:`urllib3.util.Timeout`. 166s 166s :param chunked: 166s If True, urllib3 will send the body using chunked transfer 166s encoding. Otherwise, urllib3 will send the body using the standard 166s content-length form. Defaults to False. 166s 166s :param response_conn: 166s Set this to ``None`` if you will handle releasing the connection or 166s set the connection to have the response release it. 166s 166s :param preload_content: 166s If True, the response's body will be preloaded during construction. 166s 166s :param decode_content: 166s If True, will attempt to decode the body based on the 166s 'content-encoding' header. 166s 166s :param enforce_content_length: 166s Enforce content length checking. Body returned by server must match 166s value of Content-Length header, if present. Otherwise, raise error. 166s """ 166s self.num_requests += 1 166s 166s timeout_obj = self._get_timeout(timeout) 166s timeout_obj.start_connect() 166s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 166s 166s try: 166s # Trigger any extra validation we need to do. 166s try: 166s self._validate_conn(conn) 166s except (SocketTimeout, BaseSSLError) as e: 166s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 166s raise 166s 166s # _validate_conn() starts the connection to an HTTPS proxy 166s # so we need to wrap errors with 'ProxyError' here too. 166s except ( 166s OSError, 166s NewConnectionError, 166s TimeoutError, 166s BaseSSLError, 166s CertificateError, 166s SSLError, 166s ) as e: 166s new_e: Exception = e 166s if isinstance(e, (BaseSSLError, CertificateError)): 166s new_e = SSLError(e) 166s # If the connection didn't successfully connect to it's proxy 166s # then there 166s if isinstance( 166s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 166s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 166s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 166s raise new_e 166s 166s # conn.request() calls http.client.*.request, not the method in 166s # urllib3.request. It also calls makefile (recv) on the socket. 
166s try: 166s conn.request( 166s method, 166s url, 166s body=body, 166s headers=headers, 166s chunked=chunked, 166s preload_content=preload_content, 166s decode_content=decode_content, 166s enforce_content_length=enforce_content_length, 166s ) 166s 166s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 166s # legitimately able to close the connection after sending a valid response. 166s # With this behaviour, the received response is still readable. 166s except BrokenPipeError: 166s pass 166s except OSError as e: 166s # MacOS/Linux 166s # EPROTOTYPE and ECONNRESET are needed on macOS 166s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 166s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 166s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 166s raise 166s 166s # Reset the timeout for the recv() on the socket 166s read_timeout = timeout_obj.read_timeout 166s 166s if not conn.is_closed: 166s # In Python 3 socket.py will catch EAGAIN and return None when you 166s # try and read into the file pointer created by http.client, which 166s # instead raises a BadStatusLine exception. Instead of catching 166s # the exception and assuming all BadStatusLine exceptions are read 166s # timeouts, check for a zero timeout before making the request. 166s if read_timeout == 0: 166s raise ReadTimeoutError( 166s self, url, f"Read timed out. (read timeout={read_timeout})" 166s ) 166s conn.timeout = read_timeout 166s 166s # Receive the response from the server 166s try: 166s response = conn.getresponse() 166s except (BaseSSLError, OSError) as e: 166s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 166s raise 166s 166s # Set properties that are used by the pooling layer. 
166s response.retries = retries 166s response._connection = response_conn # type: ignore[attr-defined] 166s response._pool = self # type: ignore[attr-defined] 166s 166s log.debug( 166s '%s://%s:%s "%s %s %s" %s %s', 166s self.scheme, 166s self.host, 166s self.port, 166s method, 166s url, 166s > response.version_string, 166s response.status, 166s response.length_remaining, 166s ) 166s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 166s 166s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 166s ----------------------------- Captured stderr call ----------------------------- 166s 127.0.0.1 - - [18/Jan/2025 03:25:37] "GET /gzip HTTP/1.1" 200 164 166s 127.0.0.1 - - [18/Jan/2025 03:25:37] "GET /gzip HTTP/1.1" 200 164 166s ___________________________ test_status_code[https] ____________________________ 166s 166s httpbin_both = 166s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-0/test_status_code_https_0') 166s verify_pool_mgr = 166s 166s def test_status_code(httpbin_both, tmpdir, verify_pool_mgr): 166s """Ensure that we can read the status code""" 166s url = httpbin_both.url 166s with vcr.use_cassette(str(tmpdir.join("atts.yaml"))): 166s > status_code = verify_pool_mgr.request("GET", url).status 166s 166s tests/integration/test_urllib3.py:34: 166s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 166s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 166s return self.request_encode_url( 166s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 166s return self.urlopen(method, url, **extra_kw) 166s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 166s response = conn.urlopen(method, u.request_uri, **kw) 166s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 166s response = self._make_request( 166s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 166s 166s self = 166s conn = 166s method = 'GET', url = '/', body = None, headers = {} 166s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 166s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 166s chunked = False, response_conn = None, preload_content = True 166s decode_content = True, enforce_content_length = True 166s 166s def _make_request( 166s self, 166s conn: BaseHTTPConnection, 166s method: str, 166s url: str, 166s body: _TYPE_BODY | None = None, 166s headers: typing.Mapping[str, str] | None = None, 166s retries: Retry | None = None, 166s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 166s chunked: bool = False, 166s response_conn: BaseHTTPConnection | None = None, 166s preload_content: bool = True, 166s decode_content: bool = True, 166s enforce_content_length: bool = True, 166s ) -> BaseHTTPResponse: 166s """ 166s Perform a request on a given urllib connection object taken from our 166s pool. 166s 166s :param conn: 166s a connection from one of our connection pools 166s 166s :param method: 166s HTTP request method (such as GET, POST, PUT, etc.) 166s 166s :param url: 166s The URL to perform the request on. 166s 166s :param body: 166s Data to send in the request body, either :class:`str`, :class:`bytes`, 166s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 166s 166s :param headers: 166s Dictionary of custom headers to send, such as User-Agent, 166s If-None-Match, etc. If None, pool headers are used. 
If provided, 166s these headers completely replace any pool-specific headers. 166s 166s :param retries: 166s Configure the number of retries to allow before raising a 166s :class:`~urllib3.exceptions.MaxRetryError` exception. 166s 166s Pass ``None`` to retry until you receive a response. Pass a 166s :class:`~urllib3.util.retry.Retry` object for fine-grained control 166s over different types of retries. 166s Pass an integer number to retry connection errors that many times, 166s but no other types of errors. Pass zero to never retry. 166s 166s If ``False``, then retries are disabled and any exception is raised 166s immediately. Also, instead of raising a MaxRetryError on redirects, 166s the redirect response will be returned. 166s 166s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 166s 166s :param timeout: 166s If specified, overrides the default timeout for this one 166s request. It may be a float (in seconds) or an instance of 166s :class:`urllib3.util.Timeout`. 166s 166s :param chunked: 166s If True, urllib3 will send the body using chunked transfer 166s encoding. Otherwise, urllib3 will send the body using the standard 166s content-length form. Defaults to False. 166s 166s :param response_conn: 166s Set this to ``None`` if you will handle releasing the connection or 166s set the connection to have the response release it. 166s 166s :param preload_content: 166s If True, the response's body will be preloaded during construction. 166s 166s :param decode_content: 166s If True, will attempt to decode the body based on the 166s 'content-encoding' header. 166s 166s :param enforce_content_length: 166s Enforce content length checking. Body returned by server must match 166s value of Content-Length header, if present. Otherwise, raise error. 166s """ 166s self.num_requests += 1 166s 166s timeout_obj = self._get_timeout(timeout) 166s timeout_obj.start_connect() 166s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 166s 166s try: 166s # Trigger any extra validation we need to do. 166s try: 166s self._validate_conn(conn) 166s except (SocketTimeout, BaseSSLError) as e: 166s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 166s raise 166s 166s # _validate_conn() starts the connection to an HTTPS proxy 166s # so we need to wrap errors with 'ProxyError' here too. 166s except ( 166s OSError, 166s NewConnectionError, 166s TimeoutError, 166s BaseSSLError, 166s CertificateError, 166s SSLError, 166s ) as e: 166s new_e: Exception = e 166s if isinstance(e, (BaseSSLError, CertificateError)): 166s new_e = SSLError(e) 166s # If the connection didn't successfully connect to it's proxy 166s # then there 166s if isinstance( 166s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 166s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 166s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 166s raise new_e 166s 166s # conn.request() calls http.client.*.request, not the method in 166s # urllib3.request. It also calls makefile (recv) on the socket. 166s try: 166s conn.request( 166s method, 166s url, 166s body=body, 166s headers=headers, 166s chunked=chunked, 166s preload_content=preload_content, 166s decode_content=decode_content, 166s enforce_content_length=enforce_content_length, 166s ) 166s 166s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 166s # legitimately able to close the connection after sending a valid response. 166s # With this behaviour, the received response is still readable. 
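The timeout parameter described above accepts either a bare float or a `urllib3.util.Timeout`, which is the form shown in the captured locals (`Timeout(connect=..., read=..., total=None)`). A brief sketch; the request lines are commented because they assume the local test server:

    import urllib3
    from urllib3.util import Timeout

    pool = urllib3.PoolManager()
    per_request = Timeout(connect=1.0, read=5.0)
    # pool.request("GET", "http://127.0.0.1:8080/get", timeout=per_request)
    # A bare float applies one budget to both connect and read:
    # pool.request("GET", "http://127.0.0.1:8080/get", timeout=2.5)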
166s except BrokenPipeError: 166s pass 166s except OSError as e: 166s # MacOS/Linux 166s # EPROTOTYPE and ECONNRESET are needed on macOS 166s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 166s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 166s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 166s raise 166s 166s # Reset the timeout for the recv() on the socket 166s read_timeout = timeout_obj.read_timeout 166s 166s if not conn.is_closed: 166s # In Python 3 socket.py will catch EAGAIN and return None when you 166s # try and read into the file pointer created by http.client, which 166s # instead raises a BadStatusLine exception. Instead of catching 166s # the exception and assuming all BadStatusLine exceptions are read 166s # timeouts, check for a zero timeout before making the request. 166s if read_timeout == 0: 166s raise ReadTimeoutError( 166s self, url, f"Read timed out. (read timeout={read_timeout})" 166s ) 166s conn.timeout = read_timeout 166s 166s # Receive the response from the server 166s try: 166s response = conn.getresponse() 166s except (BaseSSLError, OSError) as e: 166s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 166s raise 166s 166s # Set properties that are used by the pooling layer. 166s response.retries = retries 166s response._connection = response_conn # type: ignore[attr-defined] 166s response._pool = self # type: ignore[attr-defined] 166s 166s log.debug( 166s '%s://%s:%s "%s %s %s" %s %s', 166s self.scheme, 166s self.host, 166s self.port, 166s method, 166s url, 166s > response.version_string, 166s response.status, 166s response.length_remaining, 166s ) 166s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 166s 166s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 166s ----------------------------- Captured stderr call ----------------------------- 166s 127.0.0.1 - - [18/Jan/2025 03:25:37] "GET / HTTP/1.1" 200 9358 166s _____________________________ test_headers[https] ______________________________ 166s 166s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-0/test_headers_https_0') 166s httpbin_both = 166s verify_pool_mgr = 166s 166s def test_headers(tmpdir, httpbin_both, verify_pool_mgr): 166s """Ensure that we can read the headers back""" 166s url = httpbin_both.url 166s with vcr.use_cassette(str(tmpdir.join("headers.yaml"))): 166s > headers = verify_pool_mgr.request("GET", url).headers 166s 166s tests/integration/test_urllib3.py:44: 166s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 166s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 166s return self.request_encode_url( 166s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 166s return self.urlopen(method, url, **extra_kw) 166s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 166s response = conn.urlopen(method, u.request_uri, **kw) 166s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 166s response = self._make_request( 166s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 166s 166s self = 166s conn = 166s method = 'GET', url = '/', body = None, headers = {} 166s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 166s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 166s chunked = False, response_conn = None, 
preload_content = True 166s decode_content = True, enforce_content_length = True 166s 166s def _make_request( 166s self, 166s conn: BaseHTTPConnection, 166s method: str, 166s url: str, 166s body: _TYPE_BODY | None = None, 166s headers: typing.Mapping[str, str] | None = None, 166s retries: Retry | None = None, 166s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 166s chunked: bool = False, 166s response_conn: BaseHTTPConnection | None = None, 166s preload_content: bool = True, 166s decode_content: bool = True, 166s enforce_content_length: bool = True, 166s ) -> BaseHTTPResponse: 166s """ 166s Perform a request on a given urllib connection object taken from our 166s pool. 166s 166s :param conn: 166s a connection from one of our connection pools 166s 166s :param method: 166s HTTP request method (such as GET, POST, PUT, etc.) 166s 166s :param url: 166s The URL to perform the request on. 166s 166s :param body: 166s Data to send in the request body, either :class:`str`, :class:`bytes`, 166s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 166s 166s :param headers: 166s Dictionary of custom headers to send, such as User-Agent, 166s If-None-Match, etc. If None, pool headers are used. If provided, 166s these headers completely replace any pool-specific headers. 166s 166s :param retries: 166s Configure the number of retries to allow before raising a 166s :class:`~urllib3.exceptions.MaxRetryError` exception. 166s 166s Pass ``None`` to retry until you receive a response. Pass a 166s :class:`~urllib3.util.retry.Retry` object for fine-grained control 166s over different types of retries. 166s Pass an integer number to retry connection errors that many times, 166s but no other types of errors. Pass zero to never retry. 166s 166s If ``False``, then retries are disabled and any exception is raised 166s immediately. Also, instead of raising a MaxRetryError on redirects, 166s the redirect response will be returned. 166s 166s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 166s 166s :param timeout: 166s If specified, overrides the default timeout for this one 166s request. It may be a float (in seconds) or an instance of 166s :class:`urllib3.util.Timeout`. 166s 166s :param chunked: 166s If True, urllib3 will send the body using chunked transfer 166s encoding. Otherwise, urllib3 will send the body using the standard 166s content-length form. Defaults to False. 166s 166s :param response_conn: 166s Set this to ``None`` if you will handle releasing the connection or 166s set the connection to have the response release it. 166s 166s :param preload_content: 166s If True, the response's body will be preloaded during construction. 166s 166s :param decode_content: 166s If True, will attempt to decode the body based on the 166s 'content-encoding' header. 166s 166s :param enforce_content_length: 166s Enforce content length checking. Body returned by server must match 166s value of Content-Length header, if present. Otherwise, raise error. 166s """ 166s self.num_requests += 1 166s 166s timeout_obj = self._get_timeout(timeout) 166s timeout_obj.start_connect() 166s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 166s 166s try: 166s # Trigger any extra validation we need to do. 
166s try: 166s self._validate_conn(conn) 166s except (SocketTimeout, BaseSSLError) as e: 166s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 166s raise 166s 166s # _validate_conn() starts the connection to an HTTPS proxy 166s # so we need to wrap errors with 'ProxyError' here too. 166s except ( 166s OSError, 166s NewConnectionError, 166s TimeoutError, 166s BaseSSLError, 166s CertificateError, 166s SSLError, 166s ) as e: 166s new_e: Exception = e 166s if isinstance(e, (BaseSSLError, CertificateError)): 166s new_e = SSLError(e) 166s # If the connection didn't successfully connect to it's proxy 166s # then there 166s if isinstance( 166s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 166s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 166s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 166s raise new_e 166s 166s # conn.request() calls http.client.*.request, not the method in 166s # urllib3.request. It also calls makefile (recv) on the socket. 166s try: 166s conn.request( 166s method, 166s url, 166s body=body, 166s headers=headers, 166s chunked=chunked, 166s preload_content=preload_content, 166s decode_content=decode_content, 166s enforce_content_length=enforce_content_length, 166s ) 166s 166s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 166s # legitimately able to close the connection after sending a valid response. 166s # With this behaviour, the received response is still readable. 166s except BrokenPipeError: 166s pass 166s except OSError as e: 166s # MacOS/Linux 166s # EPROTOTYPE and ECONNRESET are needed on macOS 166s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 166s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 166s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 166s raise 166s 166s # Reset the timeout for the recv() on the socket 166s read_timeout = timeout_obj.read_timeout 166s 166s if not conn.is_closed: 166s # In Python 3 socket.py will catch EAGAIN and return None when you 166s # try and read into the file pointer created by http.client, which 166s # instead raises a BadStatusLine exception. Instead of catching 166s # the exception and assuming all BadStatusLine exceptions are read 166s # timeouts, check for a zero timeout before making the request. 166s if read_timeout == 0: 166s raise ReadTimeoutError( 166s self, url, f"Read timed out. (read timeout={read_timeout})" 166s ) 166s conn.timeout = read_timeout 166s 166s # Receive the response from the server 166s try: 166s response = conn.getresponse() 166s except (BaseSSLError, OSError) as e: 166s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 166s raise 166s 166s # Set properties that are used by the pooling layer. 
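The failing call pattern needs nothing beyond the two packages under test; a reproduction sketch against vcr.py 6.0.2 and urllib3 2.3.0 (the cassette path and URL are illustrative stand-ins for the test fixtures):

import urllib3
import vcr

with vcr.use_cassette("/tmp/repro.yaml"):
    # With the cassette active, the patched connection returns a VCRHTTPResponse
    # and urllib3 2.3.0 then fails on response.version_string, as logged below.
    urllib3.PoolManager().request("GET", "http://127.0.0.1:8080/")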
166s response.retries = retries 166s response._connection = response_conn # type: ignore[attr-defined] 166s response._pool = self # type: ignore[attr-defined] 166s 166s log.debug( 166s '%s://%s:%s "%s %s %s" %s %s', 166s self.scheme, 166s self.host, 166s self.port, 166s method, 166s url, 166s > response.version_string, 166s response.status, 166s response.length_remaining, 166s ) 166s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 166s 166s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 166s ----------------------------- Captured stderr call ----------------------------- 166s 127.0.0.1 - - [18/Jan/2025 03:25:37] "GET / HTTP/1.1" 200 9358 166s _______________________________ test_body[https] _______________________________ 166s 166s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-0/test_body_https_0') 166s httpbin_both = 166s verify_pool_mgr = 166s 166s def test_body(tmpdir, httpbin_both, verify_pool_mgr): 166s """Ensure the responses are all identical enough""" 166s url = httpbin_both.url + "/bytes/1024" 166s with vcr.use_cassette(str(tmpdir.join("body.yaml"))): 166s > content = verify_pool_mgr.request("GET", url).data 166s 166s tests/integration/test_urllib3.py:55: 166s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 166s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 166s return self.request_encode_url( 166s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 166s return self.urlopen(method, url, **extra_kw) 166s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 166s response = conn.urlopen(method, u.request_uri, **kw) 166s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 166s response = self._make_request( 166s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 166s 166s self = 166s conn = 166s method = 'GET', url = '/bytes/1024', body = None, headers = {} 166s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 166s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 166s chunked = False, response_conn = None, preload_content = True 166s decode_content = True, enforce_content_length = True 166s 166s def _make_request( 166s self, 166s conn: BaseHTTPConnection, 166s method: str, 166s url: str, 166s body: _TYPE_BODY | None = None, 166s headers: typing.Mapping[str, str] | None = None, 166s retries: Retry | None = None, 166s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 166s chunked: bool = False, 166s response_conn: BaseHTTPConnection | None = None, 166s preload_content: bool = True, 166s decode_content: bool = True, 166s enforce_content_length: bool = True, 166s ) -> BaseHTTPResponse: 166s """ 166s Perform a request on a given urllib connection object taken from our 166s pool. 166s 166s :param conn: 166s a connection from one of our connection pools 166s 166s :param method: 166s HTTP request method (such as GET, POST, PUT, etc.) 166s 166s :param url: 166s The URL to perform the request on. 166s 166s :param body: 166s Data to send in the request body, either :class:`str`, :class:`bytes`, 166s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 166s 166s :param headers: 166s Dictionary of custom headers to send, such as User-Agent, 166s If-None-Match, etc. If None, pool headers are used. If provided, 166s these headers completely replace any pool-specific headers. 
166s 166s :param retries: 166s Configure the number of retries to allow before raising a 166s :class:`~urllib3.exceptions.MaxRetryError` exception. 166s 166s Pass ``None`` to retry until you receive a response. Pass a 166s :class:`~urllib3.util.retry.Retry` object for fine-grained control 166s over different types of retries. 166s Pass an integer number to retry connection errors that many times, 166s but no other types of errors. Pass zero to never retry. 166s 166s If ``False``, then retries are disabled and any exception is raised 166s immediately. Also, instead of raising a MaxRetryError on redirects, 166s the redirect response will be returned. 166s 166s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 166s 166s :param timeout: 166s If specified, overrides the default timeout for this one 166s request. It may be a float (in seconds) or an instance of 166s :class:`urllib3.util.Timeout`. 166s 166s :param chunked: 166s If True, urllib3 will send the body using chunked transfer 166s encoding. Otherwise, urllib3 will send the body using the standard 166s content-length form. Defaults to False. 166s 166s :param response_conn: 166s Set this to ``None`` if you will handle releasing the connection or 166s set the connection to have the response release it. 166s 166s :param preload_content: 166s If True, the response's body will be preloaded during construction. 166s 166s :param decode_content: 166s If True, will attempt to decode the body based on the 166s 'content-encoding' header. 166s 166s :param enforce_content_length: 166s Enforce content length checking. Body returned by server must match 166s value of Content-Length header, if present. Otherwise, raise error. 166s """ 166s self.num_requests += 1 166s 166s timeout_obj = self._get_timeout(timeout) 166s timeout_obj.start_connect() 166s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 166s 166s try: 166s # Trigger any extra validation we need to do. 166s try: 166s self._validate_conn(conn) 166s except (SocketTimeout, BaseSSLError) as e: 166s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 166s raise 166s 166s # _validate_conn() starts the connection to an HTTPS proxy 166s # so we need to wrap errors with 'ProxyError' here too. 166s except ( 166s OSError, 166s NewConnectionError, 166s TimeoutError, 166s BaseSSLError, 166s CertificateError, 166s SSLError, 166s ) as e: 166s new_e: Exception = e 166s if isinstance(e, (BaseSSLError, CertificateError)): 166s new_e = SSLError(e) 166s # If the connection didn't successfully connect to it's proxy 166s # then there 166s if isinstance( 166s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 166s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 166s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 166s raise new_e 166s 166s # conn.request() calls http.client.*.request, not the method in 166s # urllib3.request. It also calls makefile (recv) on the socket. 166s try: 166s conn.request( 166s method, 166s url, 166s body=body, 166s headers=headers, 166s chunked=chunked, 166s preload_content=preload_content, 166s decode_content=decode_content, 166s enforce_content_length=enforce_content_length, 166s ) 166s 166s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 166s # legitimately able to close the connection after sending a valid response. 166s # With this behaviour, the received response is still readable. 
166s except BrokenPipeError: 166s pass 166s except OSError as e: 166s # MacOS/Linux 166s # EPROTOTYPE and ECONNRESET are needed on macOS 166s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 166s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 166s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 166s raise 166s 166s # Reset the timeout for the recv() on the socket 166s read_timeout = timeout_obj.read_timeout 166s 166s if not conn.is_closed: 166s # In Python 3 socket.py will catch EAGAIN and return None when you 166s # try and read into the file pointer created by http.client, which 166s # instead raises a BadStatusLine exception. Instead of catching 166s # the exception and assuming all BadStatusLine exceptions are read 166s # timeouts, check for a zero timeout before making the request. 166s if read_timeout == 0: 166s raise ReadTimeoutError( 166s self, url, f"Read timed out. (read timeout={read_timeout})" 166s ) 166s conn.timeout = read_timeout 166s 166s # Receive the response from the server 166s try: 166s response = conn.getresponse() 166s except (BaseSSLError, OSError) as e: 166s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 166s raise 166s 166s # Set properties that are used by the pooling layer. 166s response.retries = retries 166s response._connection = response_conn # type: ignore[attr-defined] 166s response._pool = self # type: ignore[attr-defined] 166s 166s log.debug( 166s '%s://%s:%s "%s %s %s" %s %s', 166s self.scheme, 166s self.host, 166s self.port, 166s method, 166s url, 166s > response.version_string, 166s response.status, 166s response.length_remaining, 166s ) 166s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 166s 166s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 166s ----------------------------- Captured stderr call ----------------------------- 166s 127.0.0.1 - - [18/Jan/2025 03:25:37] "GET /bytes/1024 HTTP/1.1" 200 1024 166s _______________________________ test_auth[https] _______________________________ 166s 166s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-0/test_auth_https_0') 166s httpbin_both = 166s verify_pool_mgr = 166s 166s def test_auth(tmpdir, httpbin_both, verify_pool_mgr): 166s """Ensure that we can handle basic auth""" 166s auth = ("user", "passwd") 166s headers = urllib3.util.make_headers(basic_auth="{}:{}".format(*auth)) 166s url = httpbin_both.url + "/basic-auth/user/passwd" 166s with vcr.use_cassette(str(tmpdir.join("auth.yaml"))): 166s > one = verify_pool_mgr.request("GET", url, headers=headers) 166s 166s tests/integration/test_urllib3.py:67: 166s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 166s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 166s return self.request_encode_url( 166s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 166s return self.urlopen(method, url, **extra_kw) 166s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 166s response = conn.urlopen(method, u.request_uri, **kw) 166s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 166s response = self._make_request( 166s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 166s 166s self = 166s conn = 166s method = 'GET', url = '/basic-auth/user/passwd', body = None 166s headers = {'authorization': 'Basic dXNlcjpwYXNzd2Q='} 166s retries = Retry(total=3, 
connect=None, read=None, redirect=None, status=None) 166s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 166s chunked = False, response_conn = None, preload_content = True 166s decode_content = True, enforce_content_length = True 166s 166s def _make_request( 166s self, 166s conn: BaseHTTPConnection, 166s method: str, 166s url: str, 166s body: _TYPE_BODY | None = None, 166s headers: typing.Mapping[str, str] | None = None, 166s retries: Retry | None = None, 166s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 166s chunked: bool = False, 166s response_conn: BaseHTTPConnection | None = None, 166s preload_content: bool = True, 166s decode_content: bool = True, 166s enforce_content_length: bool = True, 166s ) -> BaseHTTPResponse: 166s """ 166s Perform a request on a given urllib connection object taken from our 166s pool. 166s 166s :param conn: 166s a connection from one of our connection pools 166s 166s :param method: 166s HTTP request method (such as GET, POST, PUT, etc.) 166s 166s :param url: 166s The URL to perform the request on. 166s 166s :param body: 166s Data to send in the request body, either :class:`str`, :class:`bytes`, 166s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 166s 166s :param headers: 166s Dictionary of custom headers to send, such as User-Agent, 166s If-None-Match, etc. If None, pool headers are used. If provided, 166s these headers completely replace any pool-specific headers. 166s 166s :param retries: 166s Configure the number of retries to allow before raising a 166s :class:`~urllib3.exceptions.MaxRetryError` exception. 166s 166s Pass ``None`` to retry until you receive a response. Pass a 166s :class:`~urllib3.util.retry.Retry` object for fine-grained control 166s over different types of retries. 166s Pass an integer number to retry connection errors that many times, 166s but no other types of errors. Pass zero to never retry. 166s 166s If ``False``, then retries are disabled and any exception is raised 166s immediately. Also, instead of raising a MaxRetryError on redirects, 166s the redirect response will be returned. 166s 166s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 166s 166s :param timeout: 166s If specified, overrides the default timeout for this one 166s request. It may be a float (in seconds) or an instance of 166s :class:`urllib3.util.Timeout`. 166s 166s :param chunked: 166s If True, urllib3 will send the body using chunked transfer 166s encoding. Otherwise, urllib3 will send the body using the standard 166s content-length form. Defaults to False. 166s 166s :param response_conn: 166s Set this to ``None`` if you will handle releasing the connection or 166s set the connection to have the response release it. 166s 166s :param preload_content: 166s If True, the response's body will be preloaded during construction. 166s 166s :param decode_content: 166s If True, will attempt to decode the body based on the 166s 'content-encoding' header. 166s 166s :param enforce_content_length: 166s Enforce content length checking. Body returned by server must match 166s value of Content-Length header, if present. Otherwise, raise error. 166s """ 166s self.num_requests += 1 166s 166s timeout_obj = self._get_timeout(timeout) 166s timeout_obj.start_connect() 166s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 166s 166s try: 166s # Trigger any extra validation we need to do. 
166s try: 166s self._validate_conn(conn) 166s except (SocketTimeout, BaseSSLError) as e: 166s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 166s raise 166s 166s # _validate_conn() starts the connection to an HTTPS proxy 166s # so we need to wrap errors with 'ProxyError' here too. 166s except ( 166s OSError, 166s NewConnectionError, 166s TimeoutError, 166s BaseSSLError, 166s CertificateError, 166s SSLError, 166s ) as e: 166s new_e: Exception = e 166s if isinstance(e, (BaseSSLError, CertificateError)): 166s new_e = SSLError(e) 166s # If the connection didn't successfully connect to it's proxy 166s # then there 166s if isinstance( 166s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 166s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 166s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 166s raise new_e 166s 166s # conn.request() calls http.client.*.request, not the method in 166s # urllib3.request. It also calls makefile (recv) on the socket. 166s try: 166s conn.request( 166s method, 166s url, 166s body=body, 166s headers=headers, 166s chunked=chunked, 166s preload_content=preload_content, 166s decode_content=decode_content, 166s enforce_content_length=enforce_content_length, 166s ) 166s 166s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 166s # legitimately able to close the connection after sending a valid response. 166s # With this behaviour, the received response is still readable. 166s except BrokenPipeError: 166s pass 166s except OSError as e: 166s # MacOS/Linux 166s # EPROTOTYPE and ECONNRESET are needed on macOS 166s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 166s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 166s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 166s raise 166s 166s # Reset the timeout for the recv() on the socket 166s read_timeout = timeout_obj.read_timeout 166s 166s if not conn.is_closed: 166s # In Python 3 socket.py will catch EAGAIN and return None when you 166s # try and read into the file pointer created by http.client, which 166s # instead raises a BadStatusLine exception. Instead of catching 166s # the exception and assuming all BadStatusLine exceptions are read 166s # timeouts, check for a zero timeout before making the request. 166s if read_timeout == 0: 166s raise ReadTimeoutError( 166s self, url, f"Read timed out. (read timeout={read_timeout})" 166s ) 166s conn.timeout = read_timeout 166s 166s # Receive the response from the server 166s try: 166s response = conn.getresponse() 166s except (BaseSSLError, OSError) as e: 166s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 166s raise 166s 166s # Set properties that are used by the pooling layer. 
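For reference, the Authorization header shown in the test_auth and test_auth_failed tracebacks comes straight from urllib3's helper; this call mirrors what the tests do:

import urllib3

# Produces {'authorization': 'Basic dXNlcjpwYXNzd2Q='}, matching the header
# captured in the test_auth traceback above.
headers = urllib3.util.make_headers(basic_auth="user:passwd")
print(headers)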
166s response.retries = retries 166s response._connection = response_conn # type: ignore[attr-defined] 166s response._pool = self # type: ignore[attr-defined] 166s 166s log.debug( 166s '%s://%s:%s "%s %s %s" %s %s', 166s self.scheme, 166s self.host, 166s self.port, 166s method, 166s url, 166s > response.version_string, 166s response.status, 166s response.length_remaining, 166s ) 166s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 166s 166s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 166s ----------------------------- Captured stderr call ----------------------------- 166s 127.0.0.1 - - [18/Jan/2025 03:25:37] "GET /basic-auth/user/passwd HTTP/1.1" 200 46 166s ___________________________ test_auth_failed[https] ____________________________ 166s 166s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-0/test_auth_failed_https_0') 166s httpbin_both = 166s verify_pool_mgr = 166s 166s def test_auth_failed(tmpdir, httpbin_both, verify_pool_mgr): 166s """Ensure that we can save failed auth statuses""" 166s auth = ("user", "wrongwrongwrong") 166s headers = urllib3.util.make_headers(basic_auth="{}:{}".format(*auth)) 166s url = httpbin_both.url + "/basic-auth/user/passwd" 166s with vcr.use_cassette(str(tmpdir.join("auth-failed.yaml"))) as cass: 166s # Ensure that this is empty to begin with 166s assert_cassette_empty(cass) 166s > one = verify_pool_mgr.request("GET", url, headers=headers) 166s 166s tests/integration/test_urllib3.py:83: 166s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 166s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 166s return self.request_encode_url( 166s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 166s return self.urlopen(method, url, **extra_kw) 166s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 166s response = conn.urlopen(method, u.request_uri, **kw) 166s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 166s response = self._make_request( 166s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 166s 166s self = 166s conn = 166s method = 'GET', url = '/basic-auth/user/passwd', body = None 166s headers = {'authorization': 'Basic dXNlcjp3cm9uZ3dyb25nd3Jvbmc='} 166s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 166s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 166s chunked = False, response_conn = None, preload_content = True 166s decode_content = True, enforce_content_length = True 166s 166s def _make_request( 166s self, 166s conn: BaseHTTPConnection, 166s method: str, 166s url: str, 166s body: _TYPE_BODY | None = None, 166s headers: typing.Mapping[str, str] | None = None, 166s retries: Retry | None = None, 166s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 166s chunked: bool = False, 166s response_conn: BaseHTTPConnection | None = None, 166s preload_content: bool = True, 166s decode_content: bool = True, 166s enforce_content_length: bool = True, 166s ) -> BaseHTTPResponse: 166s """ 166s Perform a request on a given urllib connection object taken from our 166s pool. 166s 166s :param conn: 166s a connection from one of our connection pools 166s 166s :param method: 166s HTTP request method (such as GET, POST, PUT, etc.) 166s 166s :param url: 166s The URL to perform the request on. 
166s 166s :param body: 166s Data to send in the request body, either :class:`str`, :class:`bytes`, 166s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 166s 166s :param headers: 166s Dictionary of custom headers to send, such as User-Agent, 166s If-None-Match, etc. If None, pool headers are used. If provided, 166s these headers completely replace any pool-specific headers. 166s 166s :param retries: 166s Configure the number of retries to allow before raising a 166s :class:`~urllib3.exceptions.MaxRetryError` exception. 166s 166s Pass ``None`` to retry until you receive a response. Pass a 166s :class:`~urllib3.util.retry.Retry` object for fine-grained control 166s over different types of retries. 166s Pass an integer number to retry connection errors that many times, 166s but no other types of errors. Pass zero to never retry. 166s 166s If ``False``, then retries are disabled and any exception is raised 166s immediately. Also, instead of raising a MaxRetryError on redirects, 166s the redirect response will be returned. 166s 166s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 166s 166s :param timeout: 166s If specified, overrides the default timeout for this one 166s request. It may be a float (in seconds) or an instance of 166s :class:`urllib3.util.Timeout`. 166s 166s :param chunked: 166s If True, urllib3 will send the body using chunked transfer 166s encoding. Otherwise, urllib3 will send the body using the standard 166s content-length form. Defaults to False. 166s 166s :param response_conn: 166s Set this to ``None`` if you will handle releasing the connection or 166s set the connection to have the response release it. 166s 166s :param preload_content: 166s If True, the response's body will be preloaded during construction. 166s 166s :param decode_content: 166s If True, will attempt to decode the body based on the 166s 'content-encoding' header. 166s 166s :param enforce_content_length: 166s Enforce content length checking. Body returned by server must match 166s value of Content-Length header, if present. Otherwise, raise error. 166s """ 166s self.num_requests += 1 166s 166s timeout_obj = self._get_timeout(timeout) 166s timeout_obj.start_connect() 166s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 166s 166s try: 166s # Trigger any extra validation we need to do. 166s try: 166s self._validate_conn(conn) 166s except (SocketTimeout, BaseSSLError) as e: 166s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 166s raise 166s 166s # _validate_conn() starts the connection to an HTTPS proxy 166s # so we need to wrap errors with 'ProxyError' here too. 166s except ( 166s OSError, 166s NewConnectionError, 166s TimeoutError, 166s BaseSSLError, 166s CertificateError, 166s SSLError, 166s ) as e: 166s new_e: Exception = e 166s if isinstance(e, (BaseSSLError, CertificateError)): 166s new_e = SSLError(e) 166s # If the connection didn't successfully connect to it's proxy 166s # then there 166s if isinstance( 166s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 166s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 166s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 166s raise new_e 166s 166s # conn.request() calls http.client.*.request, not the method in 166s # urllib3.request. It also calls makefile (recv) on the socket. 
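The retries and timeout parameters documented a few lines above are normally supplied through the public request()/urlopen() calls rather than _make_request() directly; a typical urllib3 2.x call looks like this (values are arbitrary):

import urllib3
from urllib3.util import Retry, Timeout

pool = urllib3.PoolManager()
# Retry up to 3 times with backoff; give up after 2s connect / 5s read.
resp = pool.request(
    "GET",
    "http://127.0.0.1:8080/",
    retries=Retry(total=3, backoff_factor=0.5),
    timeout=Timeout(connect=2.0, read=5.0),
)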
166s try: 166s conn.request( 166s method, 166s url, 166s body=body, 166s headers=headers, 166s chunked=chunked, 166s preload_content=preload_content, 166s decode_content=decode_content, 166s enforce_content_length=enforce_content_length, 166s ) 166s 166s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 166s # legitimately able to close the connection after sending a valid response. 166s # With this behaviour, the received response is still readable. 166s except BrokenPipeError: 166s pass 166s except OSError as e: 166s # MacOS/Linux 166s # EPROTOTYPE and ECONNRESET are needed on macOS 166s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 166s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 166s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 166s raise 166s 166s # Reset the timeout for the recv() on the socket 166s read_timeout = timeout_obj.read_timeout 166s 166s if not conn.is_closed: 166s # In Python 3 socket.py will catch EAGAIN and return None when you 166s # try and read into the file pointer created by http.client, which 166s # instead raises a BadStatusLine exception. Instead of catching 166s # the exception and assuming all BadStatusLine exceptions are read 166s # timeouts, check for a zero timeout before making the request. 166s if read_timeout == 0: 166s raise ReadTimeoutError( 166s self, url, f"Read timed out. (read timeout={read_timeout})" 166s ) 166s conn.timeout = read_timeout 166s 166s # Receive the response from the server 166s try: 166s response = conn.getresponse() 166s except (BaseSSLError, OSError) as e: 166s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 166s raise 166s 166s # Set properties that are used by the pooling layer. 
166s response.retries = retries 166s response._connection = response_conn # type: ignore[attr-defined] 166s response._pool = self # type: ignore[attr-defined] 166s 166s log.debug( 166s '%s://%s:%s "%s %s %s" %s %s', 166s self.scheme, 166s self.host, 166s self.port, 166s method, 166s url, 166s > response.version_string, 166s response.status, 166s response.length_remaining, 166s ) 166s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 166s 166s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 166s ----------------------------- Captured stderr call ----------------------------- 166s 127.0.0.1 - - [18/Jan/2025 03:25:37] "GET /basic-auth/user/passwd HTTP/1.1" 401 0 166s _______________________________ test_post[https] _______________________________ 166s 166s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-0/test_post_https_0') 166s httpbin_both = 166s verify_pool_mgr = 166s 166s def test_post(tmpdir, httpbin_both, verify_pool_mgr): 166s """Ensure that we can post and cache the results""" 166s data = {"key1": "value1", "key2": "value2"} 166s url = httpbin_both.url + "/post" 166s with vcr.use_cassette(str(tmpdir.join("verify_pool_mgr.yaml"))): 166s > req1 = verify_pool_mgr.request("POST", url, data).data 166s 166s tests/integration/test_urllib3.py:94: 166s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 166s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:143: in request 166s return self.request_encode_body( 166s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:278: in request_encode_body 166s return self.urlopen(method, url, **extra_kw) 166s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 166s response = conn.urlopen(method, u.request_uri, **kw) 166s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 166s response = self._make_request( 166s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 166s 166s self = 166s conn = 166s method = 'POST', url = '/post', body = {'key1': 'value1', 'key2': 'value2'} 166s headers = HTTPHeaderDict({}) 166s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 166s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 166s chunked = False, response_conn = None, preload_content = True 166s decode_content = True, enforce_content_length = True 166s 166s def _make_request( 166s self, 166s conn: BaseHTTPConnection, 166s method: str, 166s url: str, 166s body: _TYPE_BODY | None = None, 166s headers: typing.Mapping[str, str] | None = None, 166s retries: Retry | None = None, 166s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 166s chunked: bool = False, 166s response_conn: BaseHTTPConnection | None = None, 166s preload_content: bool = True, 166s decode_content: bool = True, 166s enforce_content_length: bool = True, 166s ) -> BaseHTTPResponse: 166s """ 166s Perform a request on a given urllib connection object taken from our 166s pool. 166s 166s :param conn: 166s a connection from one of our connection pools 166s 166s :param method: 166s HTTP request method (such as GET, POST, PUT, etc.) 166s 166s :param url: 166s The URL to perform the request on. 166s 166s :param body: 166s Data to send in the request body, either :class:`str`, :class:`bytes`, 166s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 166s 166s :param headers: 166s Dictionary of custom headers to send, such as User-Agent, 166s If-None-Match, etc. 
If None, pool headers are used. If provided, 166s these headers completely replace any pool-specific headers. 166s 166s :param retries: 166s Configure the number of retries to allow before raising a 166s :class:`~urllib3.exceptions.MaxRetryError` exception. 166s 166s Pass ``None`` to retry until you receive a response. Pass a 166s :class:`~urllib3.util.retry.Retry` object for fine-grained control 166s over different types of retries. 166s Pass an integer number to retry connection errors that many times, 166s but no other types of errors. Pass zero to never retry. 166s 166s If ``False``, then retries are disabled and any exception is raised 166s immediately. Also, instead of raising a MaxRetryError on redirects, 166s the redirect response will be returned. 166s 166s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 166s 166s :param timeout: 166s If specified, overrides the default timeout for this one 166s request. It may be a float (in seconds) or an instance of 166s :class:`urllib3.util.Timeout`. 166s 166s :param chunked: 166s If True, urllib3 will send the body using chunked transfer 166s encoding. Otherwise, urllib3 will send the body using the standard 166s content-length form. Defaults to False. 166s 166s :param response_conn: 166s Set this to ``None`` if you will handle releasing the connection or 166s set the connection to have the response release it. 166s 166s :param preload_content: 166s If True, the response's body will be preloaded during construction. 166s 166s :param decode_content: 166s If True, will attempt to decode the body based on the 166s 'content-encoding' header. 166s 166s :param enforce_content_length: 166s Enforce content length checking. Body returned by server must match 166s value of Content-Length header, if present. Otherwise, raise error. 166s """ 166s self.num_requests += 1 166s 166s timeout_obj = self._get_timeout(timeout) 166s timeout_obj.start_connect() 166s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 166s 166s try: 166s # Trigger any extra validation we need to do. 166s try: 166s self._validate_conn(conn) 166s except (SocketTimeout, BaseSSLError) as e: 166s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 166s raise 166s 166s # _validate_conn() starts the connection to an HTTPS proxy 166s # so we need to wrap errors with 'ProxyError' here too. 166s except ( 166s OSError, 166s NewConnectionError, 166s TimeoutError, 166s BaseSSLError, 166s CertificateError, 166s SSLError, 166s ) as e: 166s new_e: Exception = e 166s if isinstance(e, (BaseSSLError, CertificateError)): 166s new_e = SSLError(e) 166s # If the connection didn't successfully connect to it's proxy 166s # then there 166s if isinstance( 166s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 166s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 166s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 166s raise new_e 166s 166s # conn.request() calls http.client.*.request, not the method in 166s # urllib3.request. It also calls makefile (recv) on the socket. 166s try: 166s conn.request( 166s method, 166s url, 166s body=body, 166s headers=headers, 166s chunked=chunked, 166s preload_content=preload_content, 166s decode_content=decode_content, 166s enforce_content_length=enforce_content_length, 166s ) 166s 166s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 166s # legitimately able to close the connection after sending a valid response. 
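In the test_post traceback the dict is passed positionally, so it lands in the body slot (body = {'key1': 'value1', ...} above) rather than being form-encoded, which is likely why the captured stderr shows a 501. With public urllib3 2.x API a form POST is usually spelled with fields=, e.g. (URL is a placeholder):

import urllib3

pool = urllib3.PoolManager()
# fields= lets urllib3 encode the form data (multipart by default) and set
# the matching Content-Type header itself.
resp = pool.request("POST", "http://127.0.0.1:8080/post",
                    fields={"key1": "value1", "key2": "value2"})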
166s # With this behaviour, the received response is still readable. 166s except BrokenPipeError: 166s pass 166s except OSError as e: 166s # MacOS/Linux 166s # EPROTOTYPE and ECONNRESET are needed on macOS 166s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 166s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 166s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 166s raise 166s 166s # Reset the timeout for the recv() on the socket 166s read_timeout = timeout_obj.read_timeout 166s 166s if not conn.is_closed: 166s # In Python 3 socket.py will catch EAGAIN and return None when you 166s # try and read into the file pointer created by http.client, which 166s # instead raises a BadStatusLine exception. Instead of catching 166s # the exception and assuming all BadStatusLine exceptions are read 166s # timeouts, check for a zero timeout before making the request. 166s if read_timeout == 0: 166s raise ReadTimeoutError( 166s self, url, f"Read timed out. (read timeout={read_timeout})" 166s ) 166s conn.timeout = read_timeout 166s 166s # Receive the response from the server 166s try: 166s response = conn.getresponse() 166s except (BaseSSLError, OSError) as e: 166s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 166s raise 166s 166s # Set properties that are used by the pooling layer. 166s response.retries = retries 166s response._connection = response_conn # type: ignore[attr-defined] 166s response._pool = self # type: ignore[attr-defined] 166s 166s log.debug( 166s '%s://%s:%s "%s %s %s" %s %s', 166s self.scheme, 166s self.host, 166s self.port, 166s method, 166s url, 166s > response.version_string, 166s response.status, 166s response.length_remaining, 166s ) 166s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 166s 166s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 166s ----------------------------- Captured stderr call ----------------------------- 166s 127.0.0.1 - - [18/Jan/2025 03:25:37] "POST /post HTTP/1.1" 501 159 166s _______________________________ test_gzip[https] _______________________________ 166s 166s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-0/test_gzip_https_0') 166s httpbin_both = 166s verify_pool_mgr = 166s 166s def test_gzip(tmpdir, httpbin_both, verify_pool_mgr): 166s """ 166s Ensure that requests (actually urllib3) is able to automatically decompress 166s the response body 166s """ 166s url = httpbin_both.url + "/gzip" 166s response = verify_pool_mgr.request("GET", url) 166s 166s with vcr.use_cassette(str(tmpdir.join("gzip.yaml"))): 166s > response = verify_pool_mgr.request("GET", url) 166s 166s tests/integration/test_urllib3.py:140: 166s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 166s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 166s return self.request_encode_url( 166s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 166s return self.urlopen(method, url, **extra_kw) 166s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 166s response = conn.urlopen(method, u.request_uri, **kw) 166s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 166s response = self._make_request( 166s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 166s 166s self = 166s conn = 166s method = 'GET', url = '/gzip', body = None, headers = {} 166s retries = Retry(total=3, 
connect=None, read=None, redirect=None, status=None) 166s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 166s chunked = False, response_conn = None, preload_content = True 166s decode_content = True, enforce_content_length = True 166s 166s def _make_request( 166s self, 166s conn: BaseHTTPConnection, 166s method: str, 166s url: str, 166s body: _TYPE_BODY | None = None, 166s headers: typing.Mapping[str, str] | None = None, 166s retries: Retry | None = None, 166s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 166s chunked: bool = False, 166s response_conn: BaseHTTPConnection | None = None, 166s preload_content: bool = True, 166s decode_content: bool = True, 166s enforce_content_length: bool = True, 166s ) -> BaseHTTPResponse: 166s """ 166s Perform a request on a given urllib connection object taken from our 166s pool. 166s 166s :param conn: 166s a connection from one of our connection pools 166s 166s :param method: 166s HTTP request method (such as GET, POST, PUT, etc.) 166s 166s :param url: 166s The URL to perform the request on. 166s 166s :param body: 166s Data to send in the request body, either :class:`str`, :class:`bytes`, 166s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 166s 166s :param headers: 166s Dictionary of custom headers to send, such as User-Agent, 166s If-None-Match, etc. If None, pool headers are used. If provided, 166s these headers completely replace any pool-specific headers. 166s 166s :param retries: 166s Configure the number of retries to allow before raising a 166s :class:`~urllib3.exceptions.MaxRetryError` exception. 166s 166s Pass ``None`` to retry until you receive a response. Pass a 166s :class:`~urllib3.util.retry.Retry` object for fine-grained control 166s over different types of retries. 166s Pass an integer number to retry connection errors that many times, 166s but no other types of errors. Pass zero to never retry. 166s 166s If ``False``, then retries are disabled and any exception is raised 166s immediately. Also, instead of raising a MaxRetryError on redirects, 166s the redirect response will be returned. 166s 166s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 166s 166s :param timeout: 166s If specified, overrides the default timeout for this one 166s request. It may be a float (in seconds) or an instance of 166s :class:`urllib3.util.Timeout`. 166s 166s :param chunked: 166s If True, urllib3 will send the body using chunked transfer 166s encoding. Otherwise, urllib3 will send the body using the standard 166s content-length form. Defaults to False. 166s 166s :param response_conn: 166s Set this to ``None`` if you will handle releasing the connection or 166s set the connection to have the response release it. 166s 166s :param preload_content: 166s If True, the response's body will be preloaded during construction. 166s 166s :param decode_content: 166s If True, will attempt to decode the body based on the 166s 'content-encoding' header. 166s 166s :param enforce_content_length: 166s Enforce content length checking. Body returned by server must match 166s value of Content-Length header, if present. Otherwise, raise error. 166s """ 166s self.num_requests += 1 166s 166s timeout_obj = self._get_timeout(timeout) 166s timeout_obj.start_connect() 166s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 166s 166s try: 166s # Trigger any extra validation we need to do. 
166s try: 166s self._validate_conn(conn) 166s except (SocketTimeout, BaseSSLError) as e: 166s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 166s raise 166s 166s # _validate_conn() starts the connection to an HTTPS proxy 166s # so we need to wrap errors with 'ProxyError' here too. 166s except ( 166s OSError, 166s NewConnectionError, 166s TimeoutError, 166s BaseSSLError, 166s CertificateError, 166s SSLError, 166s ) as e: 166s new_e: Exception = e 166s if isinstance(e, (BaseSSLError, CertificateError)): 166s new_e = SSLError(e) 166s # If the connection didn't successfully connect to it's proxy 166s # then there 166s if isinstance( 166s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 166s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 166s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 166s raise new_e 166s 166s # conn.request() calls http.client.*.request, not the method in 166s # urllib3.request. It also calls makefile (recv) on the socket. 166s try: 166s conn.request( 166s method, 166s url, 166s body=body, 166s headers=headers, 166s chunked=chunked, 166s preload_content=preload_content, 166s decode_content=decode_content, 166s enforce_content_length=enforce_content_length, 166s ) 166s 166s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 166s # legitimately able to close the connection after sending a valid response. 166s # With this behaviour, the received response is still readable. 166s except BrokenPipeError: 166s pass 166s except OSError as e: 166s # MacOS/Linux 166s # EPROTOTYPE and ECONNRESET are needed on macOS 166s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 166s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 166s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 166s raise 166s 166s # Reset the timeout for the recv() on the socket 166s read_timeout = timeout_obj.read_timeout 166s 166s if not conn.is_closed: 166s # In Python 3 socket.py will catch EAGAIN and return None when you 166s # try and read into the file pointer created by http.client, which 166s # instead raises a BadStatusLine exception. Instead of catching 166s # the exception and assuming all BadStatusLine exceptions are read 166s # timeouts, check for a zero timeout before making the request. 166s if read_timeout == 0: 166s raise ReadTimeoutError( 166s self, url, f"Read timed out. (read timeout={read_timeout})" 166s ) 166s conn.timeout = read_timeout 166s 166s # Receive the response from the server 166s try: 166s response = conn.getresponse() 166s except (BaseSSLError, OSError) as e: 166s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 166s raise 166s 166s # Set properties that are used by the pooling layer. 
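test_gzip leans on the decode_content=True default documented in the dump above: urllib3 inspects the Content-Encoding header and decompresses transparently, so .data is the decoded payload. A plain illustration with no cassette involved (URL is a placeholder for the local httpbin fixture):

import urllib3

resp = urllib3.PoolManager().request(
    "GET", "http://127.0.0.1:8080/gzip",
    headers={"Accept-Encoding": "gzip"},
)
# decode_content defaults to True, so resp.data is already un-gzipped here.
body = resp.data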
166s response.retries = retries 166s response._connection = response_conn # type: ignore[attr-defined] 166s response._pool = self # type: ignore[attr-defined] 166s 166s log.debug( 166s '%s://%s:%s "%s %s %s" %s %s', 166s self.scheme, 166s self.host, 166s self.port, 166s method, 166s url, 166s > response.version_string, 166s response.status, 166s response.length_remaining, 166s ) 166s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 166s 166s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 166s ----------------------------- Captured stderr call ----------------------------- 166s 127.0.0.1 - - [18/Jan/2025 03:25:37] "GET /gzip HTTP/1.1" 200 165 166s 127.0.0.1 - - [18/Jan/2025 03:25:37] "GET /gzip HTTP/1.1" 200 165 166s ________________________________ test_use_proxy ________________________________ 166s 166s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-0/test_use_proxy0') 166s httpbin = 166s proxy_server = 'http://0.0.0.0:50151' 166s 166s def test_use_proxy(tmpdir, httpbin, proxy_server): 166s """Ensure that it works with a proxy.""" 166s with vcr.use_cassette(str(tmpdir.join("proxy.yaml"))): 166s > response = requests.get(httpbin.url, proxies={"http": proxy_server}) 166s 166s tests/integration/test_proxy.py:53: 166s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 166s /usr/lib/python3/dist-packages/requests/api.py:73: in get 166s return request("get", url, params=params, **kwargs) 166s /usr/lib/python3/dist-packages/requests/api.py:59: in request 166s return session.request(method=method, url=url, **kwargs) 166s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 166s resp = self.send(prep, **send_kwargs) 166s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 166s r = adapter.send(request, **kwargs) 166s /usr/lib/python3/dist-packages/requests/adapters.py:667: in send 166s resp = conn.urlopen( 166s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 166s response = self._make_request( 166s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 166s 166s self = 166s conn = 166s method = 'GET', url = 'http://127.0.0.1:33379/', body = None 166s headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate, br', 'Accept': '*/*', 'Connection': 'keep-alive'} 166s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 166s timeout = Timeout(connect=None, read=None, total=None), chunked = False 166s response_conn = 166s preload_content = False, decode_content = False, enforce_content_length = True 166s 166s def _make_request( 166s self, 166s conn: BaseHTTPConnection, 166s method: str, 166s url: str, 166s body: _TYPE_BODY | None = None, 166s headers: typing.Mapping[str, str] | None = None, 166s retries: Retry | None = None, 166s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 166s chunked: bool = False, 166s response_conn: BaseHTTPConnection | None = None, 166s preload_content: bool = True, 166s decode_content: bool = True, 166s enforce_content_length: bool = True, 166s ) -> BaseHTTPResponse: 166s """ 166s Perform a request on a given urllib connection object taken from our 166s pool. 166s 166s :param conn: 166s a connection from one of our connection pools 166s 166s :param method: 166s HTTP request method (such as GET, POST, PUT, etc.) 166s 166s :param url: 166s The URL to perform the request on. 
166s 166s :param body: 166s Data to send in the request body, either :class:`str`, :class:`bytes`, 166s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 166s 166s :param headers: 166s Dictionary of custom headers to send, such as User-Agent, 166s If-None-Match, etc. If None, pool headers are used. If provided, 166s these headers completely replace any pool-specific headers. 166s 166s :param retries: 166s Configure the number of retries to allow before raising a 166s :class:`~urllib3.exceptions.MaxRetryError` exception. 166s 166s Pass ``None`` to retry until you receive a response. Pass a 166s :class:`~urllib3.util.retry.Retry` object for fine-grained control 166s over different types of retries. 166s Pass an integer number to retry connection errors that many times, 166s but no other types of errors. Pass zero to never retry. 166s 166s If ``False``, then retries are disabled and any exception is raised 166s immediately. Also, instead of raising a MaxRetryError on redirects, 166s the redirect response will be returned. 166s 166s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 166s 166s :param timeout: 166s If specified, overrides the default timeout for this one 166s request. It may be a float (in seconds) or an instance of 166s :class:`urllib3.util.Timeout`. 166s 166s :param chunked: 166s If True, urllib3 will send the body using chunked transfer 166s encoding. Otherwise, urllib3 will send the body using the standard 166s content-length form. Defaults to False. 166s 166s :param response_conn: 166s Set this to ``None`` if you will handle releasing the connection or 166s set the connection to have the response release it. 166s 166s :param preload_content: 166s If True, the response's body will be preloaded during construction. 166s 166s :param decode_content: 166s If True, will attempt to decode the body based on the 166s 'content-encoding' header. 166s 166s :param enforce_content_length: 166s Enforce content length checking. Body returned by server must match 166s value of Content-Length header, if present. Otherwise, raise error. 166s """ 166s self.num_requests += 1 166s 166s timeout_obj = self._get_timeout(timeout) 166s timeout_obj.start_connect() 166s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 166s 166s try: 166s # Trigger any extra validation we need to do. 166s try: 166s self._validate_conn(conn) 166s except (SocketTimeout, BaseSSLError) as e: 166s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 166s raise 166s 166s # _validate_conn() starts the connection to an HTTPS proxy 166s # so we need to wrap errors with 'ProxyError' here too. 166s except ( 166s OSError, 166s NewConnectionError, 166s TimeoutError, 166s BaseSSLError, 166s CertificateError, 166s SSLError, 166s ) as e: 166s new_e: Exception = e 166s if isinstance(e, (BaseSSLError, CertificateError)): 166s new_e = SSLError(e) 166s # If the connection didn't successfully connect to it's proxy 166s # then there 166s if isinstance( 166s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 166s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 166s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 166s raise new_e 166s 166s # conn.request() calls http.client.*.request, not the method in 166s # urllib3.request. It also calls makefile (recv) on the socket. 
166s try: 166s conn.request( 166s method, 166s url, 166s body=body, 166s headers=headers, 166s chunked=chunked, 166s preload_content=preload_content, 166s decode_content=decode_content, 166s enforce_content_length=enforce_content_length, 166s ) 166s 166s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 166s # legitimately able to close the connection after sending a valid response. 166s # With this behaviour, the received response is still readable. 166s except BrokenPipeError: 166s pass 166s except OSError as e: 166s # MacOS/Linux 166s # EPROTOTYPE and ECONNRESET are needed on macOS 166s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 166s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 166s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 166s raise 166s 166s # Reset the timeout for the recv() on the socket 166s read_timeout = timeout_obj.read_timeout 166s 166s if not conn.is_closed: 166s # In Python 3 socket.py will catch EAGAIN and return None when you 166s # try and read into the file pointer created by http.client, which 166s # instead raises a BadStatusLine exception. Instead of catching 166s # the exception and assuming all BadStatusLine exceptions are read 166s # timeouts, check for a zero timeout before making the request. 166s if read_timeout == 0: 166s raise ReadTimeoutError( 166s self, url, f"Read timed out. (read timeout={read_timeout})" 166s ) 166s conn.timeout = read_timeout 166s 166s # Receive the response from the server 166s try: 166s response = conn.getresponse() 166s except (BaseSSLError, OSError) as e: 166s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 166s raise 166s 166s # Set properties that are used by the pooling layer. 
166s response.retries = retries 166s response._connection = response_conn # type: ignore[attr-defined] 166s response._pool = self # type: ignore[attr-defined] 166s 166s log.debug( 166s '%s://%s:%s "%s %s %s" %s %s', 166s self.scheme, 166s self.host, 166s self.port, 166s method, 166s url, 166s > response.version_string, 166s response.status, 166s response.length_remaining, 166s ) 166s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 166s 166s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 166s ----------------------------- Captured stderr call ----------------------------- 166s 127.0.0.1 - - [18/Jan/2025 03:25:38] "GET / HTTP/1.1" 200 9358 166s 127.0.0.1 - - [18/Jan/2025 03:25:38] "GET http://127.0.0.1:33379/ HTTP/1.1" 200 - 166s ______________________________ test_cross_scheme _______________________________ 166s 166s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-0/test_cross_scheme2') 166s httpbin = 166s httpbin_secure = 166s verify_pool_mgr = 166s 166s def test_cross_scheme(tmpdir, httpbin, httpbin_secure, verify_pool_mgr): 166s """Ensure that requests between schemes are treated separately""" 166s # First fetch a url under http, and then again under https and then 166s # ensure that we haven't served anything out of cache, and we have two 166s # requests / response pairs in the cassette 166s with vcr.use_cassette(str(tmpdir.join("cross_scheme.yaml"))) as cass: 166s > verify_pool_mgr.request("GET", httpbin_secure.url) 166s 166s tests/integration/test_urllib3.py:125: 166s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 166s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 166s return self.request_encode_url( 166s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 166s return self.urlopen(method, url, **extra_kw) 166s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 166s response = conn.urlopen(method, u.request_uri, **kw) 166s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 166s response = self._make_request( 166s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 166s 166s self = 166s conn = 166s method = 'GET', url = '/', body = None, headers = {} 166s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 166s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 166s chunked = False, response_conn = None, preload_content = True 166s decode_content = True, enforce_content_length = True 166s 166s def _make_request( 166s self, 166s conn: BaseHTTPConnection, 166s method: str, 166s url: str, 166s body: _TYPE_BODY | None = None, 166s headers: typing.Mapping[str, str] | None = None, 166s retries: Retry | None = None, 166s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 166s chunked: bool = False, 166s response_conn: BaseHTTPConnection | None = None, 166s preload_content: bool = True, 166s decode_content: bool = True, 166s enforce_content_length: bool = True, 166s ) -> BaseHTTPResponse: 166s """ 166s Perform a request on a given urllib connection object taken from our 166s pool. 166s 166s :param conn: 166s a connection from one of our connection pools 166s 166s :param method: 166s HTTP request method (such as GET, POST, PUT, etc.) 166s 166s :param url: 166s The URL to perform the request on. 
166s 166s :param body: 166s Data to send in the request body, either :class:`str`, :class:`bytes`, 166s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 166s 166s :param headers: 166s Dictionary of custom headers to send, such as User-Agent, 166s If-None-Match, etc. If None, pool headers are used. If provided, 166s these headers completely replace any pool-specific headers. 166s 166s :param retries: 166s Configure the number of retries to allow before raising a 166s :class:`~urllib3.exceptions.MaxRetryError` exception. 166s 166s Pass ``None`` to retry until you receive a response. Pass a 166s :class:`~urllib3.util.retry.Retry` object for fine-grained control 166s over different types of retries. 166s Pass an integer number to retry connection errors that many times, 166s but no other types of errors. Pass zero to never retry. 166s 166s If ``False``, then retries are disabled and any exception is raised 166s immediately. Also, instead of raising a MaxRetryError on redirects, 166s the redirect response will be returned. 166s 166s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 166s 166s :param timeout: 166s If specified, overrides the default timeout for this one 166s request. It may be a float (in seconds) or an instance of 166s :class:`urllib3.util.Timeout`. 166s 166s :param chunked: 166s If True, urllib3 will send the body using chunked transfer 166s encoding. Otherwise, urllib3 will send the body using the standard 166s content-length form. Defaults to False. 166s 166s :param response_conn: 166s Set this to ``None`` if you will handle releasing the connection or 166s set the connection to have the response release it. 166s 166s :param preload_content: 166s If True, the response's body will be preloaded during construction. 166s 166s :param decode_content: 166s If True, will attempt to decode the body based on the 166s 'content-encoding' header. 166s 166s :param enforce_content_length: 166s Enforce content length checking. Body returned by server must match 166s value of Content-Length header, if present. Otherwise, raise error. 166s """ 166s self.num_requests += 1 166s 166s timeout_obj = self._get_timeout(timeout) 166s timeout_obj.start_connect() 166s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 166s 166s try: 166s # Trigger any extra validation we need to do. 166s try: 166s self._validate_conn(conn) 166s except (SocketTimeout, BaseSSLError) as e: 166s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 166s raise 166s 166s # _validate_conn() starts the connection to an HTTPS proxy 166s # so we need to wrap errors with 'ProxyError' here too. 166s except ( 166s OSError, 166s NewConnectionError, 166s TimeoutError, 166s BaseSSLError, 166s CertificateError, 166s SSLError, 166s ) as e: 166s new_e: Exception = e 166s if isinstance(e, (BaseSSLError, CertificateError)): 166s new_e = SSLError(e) 166s # If the connection didn't successfully connect to it's proxy 166s # then there 166s if isinstance( 166s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 166s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 166s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 166s raise new_e 166s 166s # conn.request() calls http.client.*.request, not the method in 166s # urllib3.request. It also calls makefile (recv) on the socket. 
166s try: 166s conn.request( 166s method, 166s url, 166s body=body, 166s headers=headers, 166s chunked=chunked, 166s preload_content=preload_content, 166s decode_content=decode_content, 166s enforce_content_length=enforce_content_length, 166s ) 166s 166s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 166s # legitimately able to close the connection after sending a valid response. 166s # With this behaviour, the received response is still readable. 166s except BrokenPipeError: 166s pass 166s except OSError as e: 166s # MacOS/Linux 166s # EPROTOTYPE and ECONNRESET are needed on macOS 166s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 166s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 166s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 166s raise 166s 166s # Reset the timeout for the recv() on the socket 166s read_timeout = timeout_obj.read_timeout 166s 166s if not conn.is_closed: 166s # In Python 3 socket.py will catch EAGAIN and return None when you 166s # try and read into the file pointer created by http.client, which 166s # instead raises a BadStatusLine exception. Instead of catching 166s # the exception and assuming all BadStatusLine exceptions are read 166s # timeouts, check for a zero timeout before making the request. 166s if read_timeout == 0: 166s raise ReadTimeoutError( 166s self, url, f"Read timed out. (read timeout={read_timeout})" 166s ) 166s conn.timeout = read_timeout 166s 166s # Receive the response from the server 166s try: 166s response = conn.getresponse() 166s except (BaseSSLError, OSError) as e: 166s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 166s raise 166s 166s # Set properties that are used by the pooling layer. 
166s response.retries = retries 166s response._connection = response_conn # type: ignore[attr-defined] 166s response._pool = self # type: ignore[attr-defined] 166s 166s log.debug( 166s '%s://%s:%s "%s %s %s" %s %s', 166s self.scheme, 166s self.host, 166s self.port, 166s method, 166s url, 166s > response.version_string, 166s response.status, 166s response.length_remaining, 166s ) 166s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 166s 166s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 166s ----------------------------- Captured stderr call ----------------------------- 166s 127.0.0.1 - - [18/Jan/2025 03:25:38] "GET / HTTP/1.1" 200 9358 166s ___________________ test_https_with_cert_validation_disabled ___________________ 166s 166s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-0/test_https_with_cert_validatio0') 166s httpbin_secure = 166s pool_mgr = 166s 166s def test_https_with_cert_validation_disabled(tmpdir, httpbin_secure, pool_mgr): 166s with vcr.use_cassette(str(tmpdir.join("cert_validation_disabled.yaml"))): 166s > pool_mgr.request("GET", httpbin_secure.url) 166s 166s tests/integration/test_urllib3.py:149: 166s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 166s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 166s return self.request_encode_url( 166s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 166s return self.urlopen(method, url, **extra_kw) 166s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 166s response = conn.urlopen(method, u.request_uri, **kw) 166s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 166s response = self._make_request( 166s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 166s 166s self = 166s conn = 166s method = 'GET', url = '/', body = None, headers = {} 166s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 166s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 166s chunked = False, response_conn = None, preload_content = True 166s decode_content = True, enforce_content_length = True 166s 166s def _make_request( 166s self, 166s conn: BaseHTTPConnection, 166s method: str, 166s url: str, 166s body: _TYPE_BODY | None = None, 166s headers: typing.Mapping[str, str] | None = None, 166s retries: Retry | None = None, 166s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 166s chunked: bool = False, 166s response_conn: BaseHTTPConnection | None = None, 166s preload_content: bool = True, 166s decode_content: bool = True, 166s enforce_content_length: bool = True, 166s ) -> BaseHTTPResponse: 166s """ 166s Perform a request on a given urllib connection object taken from our 166s pool. 166s 166s :param conn: 166s a connection from one of our connection pools 166s 166s :param method: 166s HTTP request method (such as GET, POST, PUT, etc.) 166s 166s :param url: 166s The URL to perform the request on. 166s 166s :param body: 166s Data to send in the request body, either :class:`str`, :class:`bytes`, 166s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 166s 166s :param headers: 166s Dictionary of custom headers to send, such as User-Agent, 166s If-None-Match, etc. If None, pool headers are used. If provided, 166s these headers completely replace any pool-specific headers. 
166s 166s :param retries: 166s Configure the number of retries to allow before raising a 166s :class:`~urllib3.exceptions.MaxRetryError` exception. 166s 166s Pass ``None`` to retry until you receive a response. Pass a 166s :class:`~urllib3.util.retry.Retry` object for fine-grained control 166s over different types of retries. 166s Pass an integer number to retry connection errors that many times, 166s but no other types of errors. Pass zero to never retry. 166s 166s If ``False``, then retries are disabled and any exception is raised 166s immediately. Also, instead of raising a MaxRetryError on redirects, 166s the redirect response will be returned. 166s 166s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 166s 166s :param timeout: 166s If specified, overrides the default timeout for this one 166s request. It may be a float (in seconds) or an instance of 166s :class:`urllib3.util.Timeout`. 166s 166s :param chunked: 166s If True, urllib3 will send the body using chunked transfer 166s encoding. Otherwise, urllib3 will send the body using the standard 166s content-length form. Defaults to False. 166s 166s :param response_conn: 166s Set this to ``None`` if you will handle releasing the connection or 166s set the connection to have the response release it. 166s 166s :param preload_content: 166s If True, the response's body will be preloaded during construction. 166s 166s :param decode_content: 166s If True, will attempt to decode the body based on the 166s 'content-encoding' header. 166s 166s :param enforce_content_length: 166s Enforce content length checking. Body returned by server must match 166s value of Content-Length header, if present. Otherwise, raise error. 166s """ 166s self.num_requests += 1 166s 166s timeout_obj = self._get_timeout(timeout) 166s timeout_obj.start_connect() 166s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 166s 166s try: 166s # Trigger any extra validation we need to do. 166s try: 166s self._validate_conn(conn) 166s except (SocketTimeout, BaseSSLError) as e: 166s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 166s raise 166s 166s # _validate_conn() starts the connection to an HTTPS proxy 166s # so we need to wrap errors with 'ProxyError' here too. 166s except ( 166s OSError, 166s NewConnectionError, 166s TimeoutError, 166s BaseSSLError, 166s CertificateError, 166s SSLError, 166s ) as e: 166s new_e: Exception = e 166s if isinstance(e, (BaseSSLError, CertificateError)): 166s new_e = SSLError(e) 166s # If the connection didn't successfully connect to it's proxy 166s # then there 166s if isinstance( 166s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 166s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 166s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 166s raise new_e 166s 166s # conn.request() calls http.client.*.request, not the method in 166s # urllib3.request. It also calls makefile (recv) on the socket. 166s try: 166s conn.request( 166s method, 166s url, 166s body=body, 166s headers=headers, 166s chunked=chunked, 166s preload_content=preload_content, 166s decode_content=decode_content, 166s enforce_content_length=enforce_content_length, 166s ) 166s 166s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 166s # legitimately able to close the connection after sending a valid response. 166s # With this behaviour, the received response is still readable. 
166s except BrokenPipeError: 166s pass 166s except OSError as e: 166s # MacOS/Linux 166s # EPROTOTYPE and ECONNRESET are needed on macOS 166s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 166s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 166s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 166s raise 166s 166s # Reset the timeout for the recv() on the socket 166s read_timeout = timeout_obj.read_timeout 166s 166s if not conn.is_closed: 166s # In Python 3 socket.py will catch EAGAIN and return None when you 166s # try and read into the file pointer created by http.client, which 166s # instead raises a BadStatusLine exception. Instead of catching 166s # the exception and assuming all BadStatusLine exceptions are read 166s # timeouts, check for a zero timeout before making the request. 166s if read_timeout == 0: 166s raise ReadTimeoutError( 166s self, url, f"Read timed out. (read timeout={read_timeout})" 166s ) 166s conn.timeout = read_timeout 166s 166s # Receive the response from the server 166s try: 166s response = conn.getresponse() 166s except (BaseSSLError, OSError) as e: 166s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 166s raise 166s 166s # Set properties that are used by the pooling layer. 166s response.retries = retries 166s response._connection = response_conn # type: ignore[attr-defined] 166s response._pool = self # type: ignore[attr-defined] 166s 166s log.debug( 166s '%s://%s:%s "%s %s %s" %s %s', 166s self.scheme, 166s self.host, 166s self.port, 166s method, 166s url, 166s > response.version_string, 166s response.status, 166s response.length_remaining, 166s ) 166s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 166s 166s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 166s ----------------------------- Captured stderr call ----------------------------- 166s 127.0.0.1 - - [18/Jan/2025 03:25:38] "GET / HTTP/1.1" 200 9358 166s _____________________________ test_domain_redirect _____________________________ 166s 166s def test_domain_redirect(): 166s """Ensure that redirects across domains are considered unique""" 166s # In this example, seomoz.org redirects to moz.com, and if those 166s # requests are considered identical, then we'll be stuck in a redirect 166s # loop. 
166s url = "http://seomoz.org/" 166s with vcr.use_cassette("tests/fixtures/wild/domain_redirect.yaml") as cass: 166s > requests.get(url, headers={"User-Agent": "vcrpy-test"}) 166s 166s tests/integration/test_wild.py:20: 166s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 166s /usr/lib/python3/dist-packages/requests/api.py:73: in get 166s return request("get", url, params=params, **kwargs) 166s /usr/lib/python3/dist-packages/requests/api.py:59: in request 166s return session.request(method=method, url=url, **kwargs) 166s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 166s resp = self.send(prep, **send_kwargs) 166s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 166s r = adapter.send(request, **kwargs) 166s /usr/lib/python3/dist-packages/requests/adapters.py:667: in send 166s resp = conn.urlopen( 166s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 166s response = self._make_request( 166s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 166s 166s self = 166s conn = 166s method = 'GET', url = '/', body = None 166s headers = {'User-Agent': 'vcrpy-test', 'Accept-Encoding': 'gzip, deflate, br', 'Accept': '*/*', 'Connection': 'keep-alive'} 166s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 166s timeout = Timeout(connect=None, read=None, total=None), chunked = False 166s response_conn = 166s preload_content = False, decode_content = False, enforce_content_length = True 166s 166s def _make_request( 166s self, 166s conn: BaseHTTPConnection, 166s method: str, 166s url: str, 166s body: _TYPE_BODY | None = None, 166s headers: typing.Mapping[str, str] | None = None, 166s retries: Retry | None = None, 166s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 166s chunked: bool = False, 166s response_conn: BaseHTTPConnection | None = None, 166s preload_content: bool = True, 166s decode_content: bool = True, 166s enforce_content_length: bool = True, 166s ) -> BaseHTTPResponse: 166s """ 166s Perform a request on a given urllib connection object taken from our 166s pool. 166s 166s :param conn: 166s a connection from one of our connection pools 166s 166s :param method: 166s HTTP request method (such as GET, POST, PUT, etc.) 166s 166s :param url: 166s The URL to perform the request on. 166s 166s :param body: 166s Data to send in the request body, either :class:`str`, :class:`bytes`, 166s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 166s 166s :param headers: 166s Dictionary of custom headers to send, such as User-Agent, 166s If-None-Match, etc. If None, pool headers are used. If provided, 166s these headers completely replace any pool-specific headers. 166s 166s :param retries: 166s Configure the number of retries to allow before raising a 166s :class:`~urllib3.exceptions.MaxRetryError` exception. 166s 166s Pass ``None`` to retry until you receive a response. Pass a 166s :class:`~urllib3.util.retry.Retry` object for fine-grained control 166s over different types of retries. 166s Pass an integer number to retry connection errors that many times, 166s but no other types of errors. Pass zero to never retry. 166s 166s If ``False``, then retries are disabled and any exception is raised 166s immediately. Also, instead of raising a MaxRetryError on redirects, 166s the redirect response will be returned. 166s 166s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 
166s 166s :param timeout: 166s If specified, overrides the default timeout for this one 166s request. It may be a float (in seconds) or an instance of 166s :class:`urllib3.util.Timeout`. 166s 166s :param chunked: 166s If True, urllib3 will send the body using chunked transfer 166s encoding. Otherwise, urllib3 will send the body using the standard 166s content-length form. Defaults to False. 166s 166s :param response_conn: 166s Set this to ``None`` if you will handle releasing the connection or 166s set the connection to have the response release it. 166s 166s :param preload_content: 166s If True, the response's body will be preloaded during construction. 166s 166s :param decode_content: 166s If True, will attempt to decode the body based on the 166s 'content-encoding' header. 166s 166s :param enforce_content_length: 166s Enforce content length checking. Body returned by server must match 166s value of Content-Length header, if present. Otherwise, raise error. 166s """ 166s self.num_requests += 1 166s 166s timeout_obj = self._get_timeout(timeout) 166s timeout_obj.start_connect() 166s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 166s 166s try: 166s # Trigger any extra validation we need to do. 166s try: 166s self._validate_conn(conn) 166s except (SocketTimeout, BaseSSLError) as e: 166s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 166s raise 166s 166s # _validate_conn() starts the connection to an HTTPS proxy 166s # so we need to wrap errors with 'ProxyError' here too. 166s except ( 166s OSError, 166s NewConnectionError, 166s TimeoutError, 166s BaseSSLError, 166s CertificateError, 166s SSLError, 166s ) as e: 166s new_e: Exception = e 166s if isinstance(e, (BaseSSLError, CertificateError)): 166s new_e = SSLError(e) 166s # If the connection didn't successfully connect to it's proxy 166s # then there 166s if isinstance( 166s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 166s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 166s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 166s raise new_e 166s 166s # conn.request() calls http.client.*.request, not the method in 166s # urllib3.request. It also calls makefile (recv) on the socket. 166s try: 166s conn.request( 166s method, 166s url, 166s body=body, 166s headers=headers, 166s chunked=chunked, 166s preload_content=preload_content, 166s decode_content=decode_content, 166s enforce_content_length=enforce_content_length, 166s ) 166s 166s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 166s # legitimately able to close the connection after sending a valid response. 166s # With this behaviour, the received response is still readable. 166s except BrokenPipeError: 166s pass 166s except OSError as e: 166s # MacOS/Linux 166s # EPROTOTYPE and ECONNRESET are needed on macOS 166s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 166s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 166s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 166s raise 166s 166s # Reset the timeout for the recv() on the socket 166s read_timeout = timeout_obj.read_timeout 166s 166s if not conn.is_closed: 166s # In Python 3 socket.py will catch EAGAIN and return None when you 166s # try and read into the file pointer created by http.client, which 166s # instead raises a BadStatusLine exception. 
Instead of catching 166s # the exception and assuming all BadStatusLine exceptions are read 166s # timeouts, check for a zero timeout before making the request. 166s if read_timeout == 0: 166s raise ReadTimeoutError( 166s self, url, f"Read timed out. (read timeout={read_timeout})" 166s ) 166s conn.timeout = read_timeout 166s 166s # Receive the response from the server 166s try: 166s response = conn.getresponse() 166s except (BaseSSLError, OSError) as e: 166s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 166s raise 166s 166s # Set properties that are used by the pooling layer. 166s response.retries = retries 166s response._connection = response_conn # type: ignore[attr-defined] 166s response._pool = self # type: ignore[attr-defined] 166s 166s log.debug( 166s '%s://%s:%s "%s %s %s" %s %s', 166s self.scheme, 166s self.host, 166s self.port, 166s method, 166s url, 166s > response.version_string, 166s response.status, 166s response.length_remaining, 166s ) 166s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 166s 166s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 166s _________________________________ test_cookies _________________________________ 166s 166s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-0/test_cookies0') 166s httpbin = 166s 166s def test_cookies(tmpdir, httpbin): 166s testfile = str(tmpdir.join("cookies.yml")) 166s with vcr.use_cassette(testfile): 166s with requests.Session() as s: 166s > s.get(httpbin.url + "/cookies/set?k1=v1&k2=v2") 166s 166s tests/integration/test_wild.py:67: 166s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 166s /usr/lib/python3/dist-packages/requests/sessions.py:602: in get 166s return self.request("GET", url, **kwargs) 166s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 166s resp = self.send(prep, **send_kwargs) 166s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 166s r = adapter.send(request, **kwargs) 166s /usr/lib/python3/dist-packages/requests/adapters.py:667: in send 166s resp = conn.urlopen( 166s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 166s response = self._make_request( 166s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 166s 166s self = 166s conn = 166s method = 'GET', url = '/cookies/set?k1=v1&k2=v2', body = None 166s headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate, br', 'Accept': '*/*', 'Connection': 'keep-alive'} 166s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 166s timeout = Timeout(connect=None, read=None, total=None), chunked = False 166s response_conn = 166s preload_content = False, decode_content = False, enforce_content_length = True 166s 166s def _make_request( 166s self, 166s conn: BaseHTTPConnection, 166s method: str, 166s url: str, 166s body: _TYPE_BODY | None = None, 166s headers: typing.Mapping[str, str] | None = None, 166s retries: Retry | None = None, 166s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 166s chunked: bool = False, 166s response_conn: BaseHTTPConnection | None = None, 166s preload_content: bool = True, 166s decode_content: bool = True, 166s enforce_content_length: bool = True, 166s ) -> BaseHTTPResponse: 166s """ 166s Perform a request on a given urllib connection object taken from our 166s pool. 
166s 166s :param conn: 166s a connection from one of our connection pools 166s 166s :param method: 166s HTTP request method (such as GET, POST, PUT, etc.) 166s 166s :param url: 166s The URL to perform the request on. 166s 166s :param body: 166s Data to send in the request body, either :class:`str`, :class:`bytes`, 166s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 166s 166s :param headers: 166s Dictionary of custom headers to send, such as User-Agent, 166s If-None-Match, etc. If None, pool headers are used. If provided, 166s these headers completely replace any pool-specific headers. 166s 166s :param retries: 166s Configure the number of retries to allow before raising a 166s :class:`~urllib3.exceptions.MaxRetryError` exception. 166s 166s Pass ``None`` to retry until you receive a response. Pass a 166s :class:`~urllib3.util.retry.Retry` object for fine-grained control 166s over different types of retries. 166s E: pybuild pybuild:389: test: plugin distutils failed with: exit code=1: cd /tmp/autopkgtest.MqadJQ/autopkgtest_tmp/build; python3.13 -m pytest --ignore tests/integration/test_aiohttp.py --ignore tests/integration/test_tornado.py --ignore tests/integration/test_requests.py -m "not online" -k "not test_basic_json_use and not test_load_cassette_with_custom_persister" 166s I: pybuild base:311: cd /tmp/autopkgtest.MqadJQ/autopkgtest_tmp/build; python3.12 -m pytest --ignore tests/integration/test_aiohttp.py --ignore tests/integration/test_tornado.py --ignore tests/integration/test_requests.py -m "not online" -k "not test_basic_json_use and not test_load_cassette_with_custom_persister" 166s Pass an integer number to retry connection errors that many times, 166s but no other types of errors. Pass zero to never retry. 166s 166s If ``False``, then retries are disabled and any exception is raised 166s immediately. Also, instead of raising a MaxRetryError on redirects, 166s the redirect response will be returned. 166s 166s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 166s 166s :param timeout: 166s If specified, overrides the default timeout for this one 166s request. It may be a float (in seconds) or an instance of 166s :class:`urllib3.util.Timeout`. 166s 166s :param chunked: 166s If True, urllib3 will send the body using chunked transfer 166s encoding. Otherwise, urllib3 will send the body using the standard 166s content-length form. Defaults to False. 166s 166s :param response_conn: 166s Set this to ``None`` if you will handle releasing the connection or 166s set the connection to have the response release it. 166s 166s :param preload_content: 166s If True, the response's body will be preloaded during construction. 166s 166s :param decode_content: 166s If True, will attempt to decode the body based on the 166s 'content-encoding' header. 166s 166s :param enforce_content_length: 166s Enforce content length checking. Body returned by server must match 166s value of Content-Length header, if present. Otherwise, raise error. 166s """ 166s self.num_requests += 1 166s 166s timeout_obj = self._get_timeout(timeout) 166s timeout_obj.start_connect() 166s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 166s 166s try: 166s # Trigger any extra validation we need to do.
166s try: 166s self._validate_conn(conn) 166s except (SocketTimeout, BaseSSLError) as e: 166s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 166s raise 166s 166s # _validate_conn() starts the connection to an HTTPS proxy 166s # so we need to wrap errors with 'ProxyError' here too. 166s except ( 166s OSError, 166s NewConnectionError, 166s TimeoutError, 166s BaseSSLError, 166s CertificateError, 166s SSLError, 166s ) as e: 166s new_e: Exception = e 166s if isinstance(e, (BaseSSLError, CertificateError)): 166s new_e = SSLError(e) 166s # If the connection didn't successfully connect to it's proxy 166s # then there 166s if isinstance( 166s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 166s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 166s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 166s raise new_e 166s 166s # conn.request() calls http.client.*.request, not the method in 166s # urllib3.request. It also calls makefile (recv) on the socket. 166s try: 166s conn.request( 166s method, 166s url, 166s body=body, 166s headers=headers, 166s chunked=chunked, 166s preload_content=preload_content, 166s decode_content=decode_content, 166s enforce_content_length=enforce_content_length, 166s ) 166s 166s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 166s # legitimately able to close the connection after sending a valid response. 166s # With this behaviour, the received response is still readable. 166s except BrokenPipeError: 166s pass 166s except OSError as e: 166s # MacOS/Linux 166s # EPROTOTYPE and ECONNRESET are needed on macOS 166s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 166s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 166s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 166s raise 166s 166s # Reset the timeout for the recv() on the socket 166s read_timeout = timeout_obj.read_timeout 166s 166s if not conn.is_closed: 166s # In Python 3 socket.py will catch EAGAIN and return None when you 166s # try and read into the file pointer created by http.client, which 166s # instead raises a BadStatusLine exception. Instead of catching 166s # the exception and assuming all BadStatusLine exceptions are read 166s # timeouts, check for a zero timeout before making the request. 166s if read_timeout == 0: 166s raise ReadTimeoutError( 166s self, url, f"Read timed out. (read timeout={read_timeout})" 166s ) 166s conn.timeout = read_timeout 166s 166s # Receive the response from the server 166s try: 166s response = conn.getresponse() 166s except (BaseSSLError, OSError) as e: 166s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 166s raise 166s 166s # Set properties that are used by the pooling layer. 
166s response.retries = retries 166s response._connection = response_conn # type: ignore[attr-defined] 166s response._pool = self # type: ignore[attr-defined] 166s 166s log.debug( 166s '%s://%s:%s "%s %s %s" %s %s', 166s self.scheme, 166s self.host, 166s self.port, 166s method, 166s url, 166s > response.version_string, 166s response.status, 166s response.length_remaining, 166s ) 166s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 166s 166s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 166s ----------------------------- Captured stderr call ----------------------------- 166s 127.0.0.1 - - [18/Jan/2025 03:25:38] "GET /cookies/set?k1=v1&k2=v2 HTTP/1.1" 302 203 166s =============================== warnings summary =============================== 166s tests/integration/test_config.py:10 166s /tmp/autopkgtest.MqadJQ/autopkgtest_tmp/build/tests/integration/test_config.py:10: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 166s @pytest.mark.online 166s 166s tests/integration/test_config.py:24 166s /tmp/autopkgtest.MqadJQ/autopkgtest_tmp/build/tests/integration/test_config.py:24: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 166s @pytest.mark.online 166s 166s tests/integration/test_config.py:34 166s /tmp/autopkgtest.MqadJQ/autopkgtest_tmp/build/tests/integration/test_config.py:34: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 166s @pytest.mark.online 166s 166s tests/integration/test_config.py:47 166s /tmp/autopkgtest.MqadJQ/autopkgtest_tmp/build/tests/integration/test_config.py:47: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 166s @pytest.mark.online 166s 166s tests/integration/test_config.py:69 166s /tmp/autopkgtest.MqadJQ/autopkgtest_tmp/build/tests/integration/test_config.py:69: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 166s @pytest.mark.online 166s 166s tests/integration/test_disksaver.py:14 166s /tmp/autopkgtest.MqadJQ/autopkgtest_tmp/build/tests/integration/test_disksaver.py:14: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 166s @pytest.mark.online 166s 166s tests/integration/test_disksaver.py:35 166s /tmp/autopkgtest.MqadJQ/autopkgtest_tmp/build/tests/integration/test_disksaver.py:35: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 166s @pytest.mark.online 166s 166s tests/integration/test_httplib2.py:60 166s /tmp/autopkgtest.MqadJQ/autopkgtest_tmp/build/tests/integration/test_httplib2.py:60: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? 
You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 166s @pytest.mark.online 166s 166s tests/integration/test_register_matcher.py:16 166s /tmp/autopkgtest.MqadJQ/autopkgtest_tmp/build/tests/integration/test_register_matcher.py:16: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 166s @pytest.mark.online 166s 166s tests/integration/test_register_matcher.py:32 166s /tmp/autopkgtest.MqadJQ/autopkgtest_tmp/build/tests/integration/test_register_matcher.py:32: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 166s @pytest.mark.online 166s 166s tests/integration/test_urllib2.py:60 166s /tmp/autopkgtest.MqadJQ/autopkgtest_tmp/build/tests/integration/test_urllib2.py:60: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 166s @mark.online 166s 166s tests/integration/test_urllib3.py:102 166s /tmp/autopkgtest.MqadJQ/autopkgtest_tmp/build/tests/integration/test_urllib3.py:102: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 166s @pytest.mark.online 166s 166s tests/integration/test_wild.py:55 166s /tmp/autopkgtest.MqadJQ/autopkgtest_tmp/build/tests/integration/test_wild.py:55: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 166s @pytest.mark.online 166s 166s tests/integration/test_wild.py:74 166s /tmp/autopkgtest.MqadJQ/autopkgtest_tmp/build/tests/integration/test_wild.py:74: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 166s @pytest.mark.online 166s 166s tests/unit/test_stubs.py:20 166s /tmp/autopkgtest.MqadJQ/autopkgtest_tmp/build/tests/unit/test_stubs.py:20: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 166s @mark.online 166s 166s tests/unit/test_unittest.py:131 166s /tmp/autopkgtest.MqadJQ/autopkgtest_tmp/build/tests/unit/test_unittest.py:131: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 166s @pytest.mark.online 166s 166s tests/unit/test_unittest.py:166 166s /tmp/autopkgtest.MqadJQ/autopkgtest_tmp/build/tests/unit/test_unittest.py:166: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 166s @pytest.mark.online 166s 166s tests/integration/test_wild.py::test_xmlrpclib 166s /usr/lib/python3.13/multiprocessing/popen_fork.py:67: DeprecationWarning: This process (pid=3484) is multi-threaded, use of fork() may lead to deadlocks in the child. 
166s self.pid = os.fork() 166s 166s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 166s =========================== short test summary info ============================ 166s FAILED tests/integration/test_urllib3.py::test_status_code[http] - AttributeE... 166s FAILED tests/integration/test_urllib3.py::test_headers[http] - AttributeError... 166s FAILED tests/integration/test_urllib3.py::test_body[http] - AttributeError: '... 166s FAILED tests/integration/test_urllib3.py::test_auth[http] - AttributeError: '... 166s FAILED tests/integration/test_urllib3.py::test_auth_failed[http] - AttributeE... 166s FAILED tests/integration/test_urllib3.py::test_post[http] - urllib3.exception... 166s FAILED tests/integration/test_urllib3.py::test_gzip[http] - AttributeError: '... 166s FAILED tests/integration/test_urllib3.py::test_status_code[https] - Attribute... 166s FAILED tests/integration/test_urllib3.py::test_headers[https] - AttributeErro... 166s FAILED tests/integration/test_urllib3.py::test_body[https] - AttributeError: ... 166s FAILED tests/integration/test_urllib3.py::test_auth[https] - AttributeError: ... 166s FAILED tests/integration/test_urllib3.py::test_auth_failed[https] - Attribute... 166s FAILED tests/integration/test_urllib3.py::test_post[https] - AttributeError: ... 166s FAILED tests/integration/test_urllib3.py::test_gzip[https] - AttributeError: ... 166s FAILED tests/integration/test_proxy.py::test_use_proxy - AttributeError: 'VCR... 166s FAILED tests/integration/test_urllib3.py::test_cross_scheme - AttributeError:... 166s FAILED tests/integration/test_urllib3.py::test_https_with_cert_validation_disabled 166s FAILED tests/integration/test_wild.py::test_domain_redirect - AttributeError:... 166s FAILED tests/integration/test_wild.py::test_cookies - AttributeError: 'VCRHTT... 166s ==== 19 failed, 265 passed, 3 skipped, 19 deselected, 18 warnings in 5.45s ===== 168s ============================= test session starts ============================== 168s platform linux -- Python 3.12.8, pytest-8.3.4, pluggy-1.5.0 168s rootdir: /tmp/autopkgtest.MqadJQ/autopkgtest_tmp/build 168s plugins: typeguard-4.4.1, httpbin-2.1.0, tornado-0.8.1 168s collected 305 items / 19 deselected / 1 skipped / 286 selected 168s 168s tests/integration/test_basic.py .... [ 1%] 168s tests/integration/test_boto3.py ss [ 2%] 168s tests/integration/test_config.py . [ 2%] 168s tests/integration/test_filter.py .......... [ 5%] 168s tests/integration/test_httplib2.py ........ [ 8%] 168s tests/integration/test_urllib2.py ........ [ 11%] 169s tests/integration/test_urllib3.py FFFFFFF [ 13%] 169s tests/integration/test_httplib2.py ........ [ 16%] 169s tests/integration/test_urllib2.py ........ [ 19%] 170s tests/integration/test_urllib3.py FFFFFFF [ 22%] 170s tests/integration/test_httplib2.py . [ 22%] 170s tests/integration/test_ignore.py .... [ 23%] 170s tests/integration/test_matchers.py .............. [ 28%] 170s tests/integration/test_multiple.py . [ 29%] 170s tests/integration/test_proxy.py F [ 29%] 170s tests/integration/test_record_mode.py ........ [ 32%] 170s tests/integration/test_register_persister.py .. [ 32%] 170s tests/integration/test_register_serializer.py . [ 33%] 170s tests/integration/test_request.py .. [ 33%] 170s tests/integration/test_stubs.py .... [ 35%] 170s tests/integration/test_urllib2.py . [ 35%] 170s tests/integration/test_urllib3.py FF. [ 36%] 171s tests/integration/test_wild.py F.F. [ 38%] 171s tests/unit/test_cassettes.py ............................... 
[ 48%] 171s tests/unit/test_errors.py .... [ 50%] 171s tests/unit/test_filters.py ........................ [ 58%] 171s tests/unit/test_json_serializer.py . [ 59%] 171s tests/unit/test_matchers.py ............................ [ 68%] 171s tests/unit/test_migration.py ... [ 69%] 171s tests/unit/test_persist.py .... [ 71%] 171s tests/unit/test_request.py ................. [ 77%] 171s tests/unit/test_response.py .... [ 78%] 171s tests/unit/test_serialize.py ............... [ 83%] 171s tests/unit/test_stubs.py ... [ 84%] 171s tests/unit/test_unittest.py ....... [ 87%] 171s tests/unit/test_util.py ........... [ 91%] 171s tests/unit/test_vcr.py ........................ [ 99%] 172s tests/unit/test_vcr_import.py . [100%] 172s 172s =================================== FAILURES =================================== 172s ____________________________ test_status_code[http] ____________________________ 172s 172s httpbin_both = 172s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-1/test_status_code_http_0') 172s verify_pool_mgr = 172s 172s def test_status_code(httpbin_both, tmpdir, verify_pool_mgr): 172s """Ensure that we can read the status code""" 172s url = httpbin_both.url 172s with vcr.use_cassette(str(tmpdir.join("atts.yaml"))): 172s > status_code = verify_pool_mgr.request("GET", url).status 172s 172s tests/integration/test_urllib3.py:34: 172s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 172s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 172s return self.request_encode_url( 172s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 172s return self.urlopen(method, url, **extra_kw) 172s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 172s response = conn.urlopen(method, u.request_uri, **kw) 172s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 172s response = self._make_request( 172s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 172s 172s self = 172s conn = 172s method = 'GET', url = '/', body = None, headers = {} 172s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 172s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 172s chunked = False, response_conn = None, preload_content = True 172s decode_content = True, enforce_content_length = True 172s 172s def _make_request( 172s self, 172s conn: BaseHTTPConnection, 172s method: str, 172s url: str, 172s body: _TYPE_BODY | None = None, 172s headers: typing.Mapping[str, str] | None = None, 172s retries: Retry | None = None, 172s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 172s chunked: bool = False, 172s response_conn: BaseHTTPConnection | None = None, 172s preload_content: bool = True, 172s decode_content: bool = True, 172s enforce_content_length: bool = True, 172s ) -> BaseHTTPResponse: 172s """ 172s Perform a request on a given urllib connection object taken from our 172s pool. 172s 172s :param conn: 172s a connection from one of our connection pools 172s 172s :param method: 172s HTTP request method (such as GET, POST, PUT, etc.) 172s 172s :param url: 172s The URL to perform the request on. 172s 172s :param body: 172s Data to send in the request body, either :class:`str`, :class:`bytes`, 172s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 172s 172s :param headers: 172s Dictionary of custom headers to send, such as User-Agent, 172s If-None-Match, etc. If None, pool headers are used. 
If provided, 172s these headers completely replace any pool-specific headers. 172s 172s :param retries: 172s Configure the number of retries to allow before raising a 172s :class:`~urllib3.exceptions.MaxRetryError` exception. 172s 172s Pass ``None`` to retry until you receive a response. Pass a 172s :class:`~urllib3.util.retry.Retry` object for fine-grained control 172s over different types of retries. 172s Pass an integer number to retry connection errors that many times, 172s but no other types of errors. Pass zero to never retry. 172s 172s If ``False``, then retries are disabled and any exception is raised 172s immediately. Also, instead of raising a MaxRetryError on redirects, 172s the redirect response will be returned. 172s 172s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 172s 172s :param timeout: 172s If specified, overrides the default timeout for this one 172s request. It may be a float (in seconds) or an instance of 172s :class:`urllib3.util.Timeout`. 172s 172s :param chunked: 172s If True, urllib3 will send the body using chunked transfer 172s encoding. Otherwise, urllib3 will send the body using the standard 172s content-length form. Defaults to False. 172s 172s :param response_conn: 172s Set this to ``None`` if you will handle releasing the connection or 172s set the connection to have the response release it. 172s 172s :param preload_content: 172s If True, the response's body will be preloaded during construction. 172s 172s :param decode_content: 172s If True, will attempt to decode the body based on the 172s 'content-encoding' header. 172s 172s :param enforce_content_length: 172s Enforce content length checking. Body returned by server must match 172s value of Content-Length header, if present. Otherwise, raise error. 172s """ 172s self.num_requests += 1 172s 172s timeout_obj = self._get_timeout(timeout) 172s timeout_obj.start_connect() 172s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 172s 172s try: 172s # Trigger any extra validation we need to do. 172s try: 172s self._validate_conn(conn) 172s except (SocketTimeout, BaseSSLError) as e: 172s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 172s raise 172s 172s # _validate_conn() starts the connection to an HTTPS proxy 172s # so we need to wrap errors with 'ProxyError' here too. 172s except ( 172s OSError, 172s NewConnectionError, 172s TimeoutError, 172s BaseSSLError, 172s CertificateError, 172s SSLError, 172s ) as e: 172s new_e: Exception = e 172s if isinstance(e, (BaseSSLError, CertificateError)): 172s new_e = SSLError(e) 172s # If the connection didn't successfully connect to it's proxy 172s # then there 172s if isinstance( 172s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 172s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 172s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 172s raise new_e 172s 172s # conn.request() calls http.client.*.request, not the method in 172s # urllib3.request. It also calls makefile (recv) on the socket. 172s try: 172s conn.request( 172s method, 172s url, 172s body=body, 172s headers=headers, 172s chunked=chunked, 172s preload_content=preload_content, 172s decode_content=decode_content, 172s enforce_content_length=enforce_content_length, 172s ) 172s 172s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 172s # legitimately able to close the connection after sending a valid response. 172s # With this behaviour, the received response is still readable. 
172s except BrokenPipeError: 172s pass 172s except OSError as e: 172s # MacOS/Linux 172s # EPROTOTYPE and ECONNRESET are needed on macOS 172s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 172s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 172s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 172s raise 172s 172s # Reset the timeout for the recv() on the socket 172s read_timeout = timeout_obj.read_timeout 172s 172s if not conn.is_closed: 172s # In Python 3 socket.py will catch EAGAIN and return None when you 172s # try and read into the file pointer created by http.client, which 172s # instead raises a BadStatusLine exception. Instead of catching 172s # the exception and assuming all BadStatusLine exceptions are read 172s # timeouts, check for a zero timeout before making the request. 172s if read_timeout == 0: 172s raise ReadTimeoutError( 172s self, url, f"Read timed out. (read timeout={read_timeout})" 172s ) 172s conn.timeout = read_timeout 172s 172s # Receive the response from the server 172s try: 172s response = conn.getresponse() 172s except (BaseSSLError, OSError) as e: 172s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 172s raise 172s 172s # Set properties that are used by the pooling layer. 172s response.retries = retries 172s response._connection = response_conn # type: ignore[attr-defined] 172s response._pool = self # type: ignore[attr-defined] 172s 172s log.debug( 172s '%s://%s:%s "%s %s %s" %s %s', 172s self.scheme, 172s self.host, 172s self.port, 172s method, 172s url, 172s > response.version_string, 172s response.status, 172s response.length_remaining, 172s ) 172s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 172s 172s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 172s ----------------------------- Captured stderr call ----------------------------- 172s 127.0.0.1 - - [18/Jan/2025 03:25:42] "GET / HTTP/1.1" 200 9358 172s ______________________________ test_headers[http] ______________________________ 172s 172s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-1/test_headers_http_0') 172s httpbin_both = 172s verify_pool_mgr = 172s 172s def test_headers(tmpdir, httpbin_both, verify_pool_mgr): 172s """Ensure that we can read the headers back""" 172s url = httpbin_both.url 172s with vcr.use_cassette(str(tmpdir.join("headers.yaml"))): 172s > headers = verify_pool_mgr.request("GET", url).headers 172s 172s tests/integration/test_urllib3.py:44: 172s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 172s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 172s return self.request_encode_url( 172s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 172s return self.urlopen(method, url, **extra_kw) 172s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 172s response = conn.urlopen(method, u.request_uri, **kw) 172s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 172s response = self._make_request( 172s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 172s 172s self = 172s conn = 172s method = 'GET', url = '/', body = None, headers = {} 172s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 172s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 172s chunked = False, response_conn = None, 
preload_content = True 172s decode_content = True, enforce_content_length = True 172s 172s def _make_request( 172s self, 172s conn: BaseHTTPConnection, 172s method: str, 172s url: str, 172s body: _TYPE_BODY | None = None, 172s headers: typing.Mapping[str, str] | None = None, 172s retries: Retry | None = None, 172s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 172s chunked: bool = False, 172s response_conn: BaseHTTPConnection | None = None, 172s preload_content: bool = True, 172s decode_content: bool = True, 172s enforce_content_length: bool = True, 172s ) -> BaseHTTPResponse: 172s """ 172s Perform a request on a given urllib connection object taken from our 172s pool. 172s 172s :param conn: 172s a connection from one of our connection pools 172s 172s :param method: 172s HTTP request method (such as GET, POST, PUT, etc.) 172s 172s :param url: 172s The URL to perform the request on. 172s 172s :param body: 172s Data to send in the request body, either :class:`str`, :class:`bytes`, 172s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 172s 172s :param headers: 172s Dictionary of custom headers to send, such as User-Agent, 172s If-None-Match, etc. If None, pool headers are used. If provided, 172s these headers completely replace any pool-specific headers. 172s 172s :param retries: 172s Configure the number of retries to allow before raising a 172s :class:`~urllib3.exceptions.MaxRetryError` exception. 172s 172s Pass ``None`` to retry until you receive a response. Pass a 172s :class:`~urllib3.util.retry.Retry` object for fine-grained control 172s over different types of retries. 172s Pass an integer number to retry connection errors that many times, 172s but no other types of errors. Pass zero to never retry. 172s 172s If ``False``, then retries are disabled and any exception is raised 172s immediately. Also, instead of raising a MaxRetryError on redirects, 172s the redirect response will be returned. 172s 172s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 172s 172s :param timeout: 172s If specified, overrides the default timeout for this one 172s request. It may be a float (in seconds) or an instance of 172s :class:`urllib3.util.Timeout`. 172s 172s :param chunked: 172s If True, urllib3 will send the body using chunked transfer 172s encoding. Otherwise, urllib3 will send the body using the standard 172s content-length form. Defaults to False. 172s 172s :param response_conn: 172s Set this to ``None`` if you will handle releasing the connection or 172s set the connection to have the response release it. 172s 172s :param preload_content: 172s If True, the response's body will be preloaded during construction. 172s 172s :param decode_content: 172s If True, will attempt to decode the body based on the 172s 'content-encoding' header. 172s 172s :param enforce_content_length: 172s Enforce content length checking. Body returned by server must match 172s value of Content-Length header, if present. Otherwise, raise error. 172s """ 172s self.num_requests += 1 172s 172s timeout_obj = self._get_timeout(timeout) 172s timeout_obj.start_connect() 172s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 172s 172s try: 172s # Trigger any extra validation we need to do. 
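A minimal sketch, not part of the captured output, of the connect/read timeout split that _make_request resolves just above; the 2.0 and 7.0 second values are arbitrary.

from urllib3.util.timeout import Timeout

t = Timeout(connect=2.0, read=7.0)
t.start_connect()           # mark the start of the connect phase, as _make_request does
print(t.connect_timeout)    # 2.0 -- applied while establishing the connection
print(t.read_timeout)       # 7.0 -- applied later to the socket before recv()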
172s try: 172s self._validate_conn(conn) 172s except (SocketTimeout, BaseSSLError) as e: 172s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 172s raise 172s 172s # _validate_conn() starts the connection to an HTTPS proxy 172s # so we need to wrap errors with 'ProxyError' here too. 172s except ( 172s OSError, 172s NewConnectionError, 172s TimeoutError, 172s BaseSSLError, 172s CertificateError, 172s SSLError, 172s ) as e: 172s new_e: Exception = e 172s if isinstance(e, (BaseSSLError, CertificateError)): 172s new_e = SSLError(e) 172s # If the connection didn't successfully connect to it's proxy 172s # then there 172s if isinstance( 172s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 172s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 172s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 172s raise new_e 172s 172s # conn.request() calls http.client.*.request, not the method in 172s # urllib3.request. It also calls makefile (recv) on the socket. 172s try: 172s conn.request( 172s method, 172s url, 172s body=body, 172s headers=headers, 172s chunked=chunked, 172s preload_content=preload_content, 172s decode_content=decode_content, 172s enforce_content_length=enforce_content_length, 172s ) 172s 172s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 172s # legitimately able to close the connection after sending a valid response. 172s # With this behaviour, the received response is still readable. 172s except BrokenPipeError: 172s pass 172s except OSError as e: 172s # MacOS/Linux 172s # EPROTOTYPE and ECONNRESET are needed on macOS 172s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 172s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 172s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 172s raise 172s 172s # Reset the timeout for the recv() on the socket 172s read_timeout = timeout_obj.read_timeout 172s 172s if not conn.is_closed: 172s # In Python 3 socket.py will catch EAGAIN and return None when you 172s # try and read into the file pointer created by http.client, which 172s # instead raises a BadStatusLine exception. Instead of catching 172s # the exception and assuming all BadStatusLine exceptions are read 172s # timeouts, check for a zero timeout before making the request. 172s if read_timeout == 0: 172s raise ReadTimeoutError( 172s self, url, f"Read timed out. (read timeout={read_timeout})" 172s ) 172s conn.timeout = read_timeout 172s 172s # Receive the response from the server 172s try: 172s response = conn.getresponse() 172s except (BaseSSLError, OSError) as e: 172s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 172s raise 172s 172s # Set properties that are used by the pooling layer. 
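A minimal compatibility sketch, not part of the captured output, illustrating the attribute the log.debug() call just below expects: urllib3 2.3 derives version_string from the integer HTTP version, so a stand-in response class such as vcr.py's VCRHTTPResponse needs an equivalent property. The class name here is hypothetical.

class VersionStringShim:
    # http.client convention: version is 10 for HTTP/1.0 and 11 for HTTP/1.1
    version = 11

    @property
    def version_string(self) -> str:
        if self.version == 10:
            return "HTTP/1.0"
        if self.version == 11:
            return "HTTP/1.1"
        return "HTTP/?"

print(VersionStringShim().version_string)   # HTTP/1.1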
172s response.retries = retries 172s response._connection = response_conn # type: ignore[attr-defined] 172s response._pool = self # type: ignore[attr-defined] 172s 172s log.debug( 172s '%s://%s:%s "%s %s %s" %s %s', 172s self.scheme, 172s self.host, 172s self.port, 172s method, 172s url, 172s > response.version_string, 172s response.status, 172s response.length_remaining, 172s ) 172s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 172s 172s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 172s ----------------------------- Captured stderr call ----------------------------- 172s 127.0.0.1 - - [18/Jan/2025 03:25:42] "GET / HTTP/1.1" 200 9358 172s _______________________________ test_body[http] ________________________________ 172s 172s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-1/test_body_http_0') 172s httpbin_both = 172s verify_pool_mgr = 172s 172s def test_body(tmpdir, httpbin_both, verify_pool_mgr): 172s """Ensure the responses are all identical enough""" 172s url = httpbin_both.url + "/bytes/1024" 172s with vcr.use_cassette(str(tmpdir.join("body.yaml"))): 172s > content = verify_pool_mgr.request("GET", url).data 172s 172s tests/integration/test_urllib3.py:55: 172s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 172s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 172s return self.request_encode_url( 172s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 172s return self.urlopen(method, url, **extra_kw) 172s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 172s response = conn.urlopen(method, u.request_uri, **kw) 172s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 172s response = self._make_request( 172s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 172s 172s self = 172s conn = 172s method = 'GET', url = '/bytes/1024', body = None, headers = {} 172s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 172s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 172s chunked = False, response_conn = None, preload_content = True 172s decode_content = True, enforce_content_length = True 172s 172s def _make_request( 172s self, 172s conn: BaseHTTPConnection, 172s method: str, 172s url: str, 172s body: _TYPE_BODY | None = None, 172s headers: typing.Mapping[str, str] | None = None, 172s retries: Retry | None = None, 172s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 172s chunked: bool = False, 172s response_conn: BaseHTTPConnection | None = None, 172s preload_content: bool = True, 172s decode_content: bool = True, 172s enforce_content_length: bool = True, 172s ) -> BaseHTTPResponse: 172s """ 172s Perform a request on a given urllib connection object taken from our 172s pool. 172s 172s :param conn: 172s a connection from one of our connection pools 172s 172s :param method: 172s HTTP request method (such as GET, POST, PUT, etc.) 172s 172s :param url: 172s The URL to perform the request on. 172s 172s :param body: 172s Data to send in the request body, either :class:`str`, :class:`bytes`, 172s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 172s 172s :param headers: 172s Dictionary of custom headers to send, such as User-Agent, 172s If-None-Match, etc. If None, pool headers are used. If provided, 172s these headers completely replace any pool-specific headers. 
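A minimal sketch, not part of the captured output, of the header-replacement rule stated in the docstring above: headers passed per request replace the pool-level defaults rather than being merged with them. The URL and header values are placeholders.

import urllib3

http = urllib3.PoolManager(headers={"User-Agent": "pool-default/1.0"})
# Only the Accept header is sent here; the pool's User-Agent default is not merged in.
resp = http.request("GET", "https://example.com/", headers={"Accept": "application/json"})
print(resp.status)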
172s 172s :param retries: 172s Configure the number of retries to allow before raising a 172s :class:`~urllib3.exceptions.MaxRetryError` exception. 172s 172s Pass ``None`` to retry until you receive a response. Pass a 172s :class:`~urllib3.util.retry.Retry` object for fine-grained control 172s over different types of retries. 172s Pass an integer number to retry connection errors that many times, 172s but no other types of errors. Pass zero to never retry. 172s 172s If ``False``, then retries are disabled and any exception is raised 172s immediately. Also, instead of raising a MaxRetryError on redirects, 172s the redirect response will be returned. 172s 172s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 172s 172s :param timeout: 172s If specified, overrides the default timeout for this one 172s request. It may be a float (in seconds) or an instance of 172s :class:`urllib3.util.Timeout`. 172s 172s :param chunked: 172s If True, urllib3 will send the body using chunked transfer 172s encoding. Otherwise, urllib3 will send the body using the standard 172s content-length form. Defaults to False. 172s 172s :param response_conn: 172s Set this to ``None`` if you will handle releasing the connection or 172s set the connection to have the response release it. 172s 172s :param preload_content: 172s If True, the response's body will be preloaded during construction. 172s 172s :param decode_content: 172s If True, will attempt to decode the body based on the 172s 'content-encoding' header. 172s 172s :param enforce_content_length: 172s Enforce content length checking. Body returned by server must match 172s value of Content-Length header, if present. Otherwise, raise error. 172s """ 172s self.num_requests += 1 172s 172s timeout_obj = self._get_timeout(timeout) 172s timeout_obj.start_connect() 172s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 172s 172s try: 172s # Trigger any extra validation we need to do. 172s try: 172s self._validate_conn(conn) 172s except (SocketTimeout, BaseSSLError) as e: 172s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 172s raise 172s 172s # _validate_conn() starts the connection to an HTTPS proxy 172s # so we need to wrap errors with 'ProxyError' here too. 172s except ( 172s OSError, 172s NewConnectionError, 172s TimeoutError, 172s BaseSSLError, 172s CertificateError, 172s SSLError, 172s ) as e: 172s new_e: Exception = e 172s if isinstance(e, (BaseSSLError, CertificateError)): 172s new_e = SSLError(e) 172s # If the connection didn't successfully connect to it's proxy 172s # then there 172s if isinstance( 172s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 172s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 172s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 172s raise new_e 172s 172s # conn.request() calls http.client.*.request, not the method in 172s # urllib3.request. It also calls makefile (recv) on the socket. 172s try: 172s conn.request( 172s method, 172s url, 172s body=body, 172s headers=headers, 172s chunked=chunked, 172s preload_content=preload_content, 172s decode_content=decode_content, 172s enforce_content_length=enforce_content_length, 172s ) 172s 172s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 172s # legitimately able to close the connection after sending a valid response. 172s # With this behaviour, the received response is still readable. 
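A minimal sketch, not part of the captured output, of the retries=False behaviour described in the docstring above: the redirect response is handed back instead of being followed or raising MaxRetryError. It assumes the public httpbin.org service is reachable.

import urllib3

http = urllib3.PoolManager()
resp = http.request("GET", "https://httpbin.org/redirect-to?url=/get", retries=False)
print(resp.status)                    # 302, returned to the caller as-is
print(resp.headers.get("Location"))   # /get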
172s except BrokenPipeError: 172s pass 172s except OSError as e: 172s # MacOS/Linux 172s # EPROTOTYPE and ECONNRESET are needed on macOS 172s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 172s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 172s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 172s raise 172s 172s # Reset the timeout for the recv() on the socket 172s read_timeout = timeout_obj.read_timeout 172s 172s if not conn.is_closed: 172s # In Python 3 socket.py will catch EAGAIN and return None when you 172s # try and read into the file pointer created by http.client, which 172s # instead raises a BadStatusLine exception. Instead of catching 172s # the exception and assuming all BadStatusLine exceptions are read 172s # timeouts, check for a zero timeout before making the request. 172s if read_timeout == 0: 172s raise ReadTimeoutError( 172s self, url, f"Read timed out. (read timeout={read_timeout})" 172s ) 172s conn.timeout = read_timeout 172s 172s # Receive the response from the server 172s try: 172s response = conn.getresponse() 172s except (BaseSSLError, OSError) as e: 172s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 172s raise 172s 172s # Set properties that are used by the pooling layer. 172s response.retries = retries 172s response._connection = response_conn # type: ignore[attr-defined] 172s response._pool = self # type: ignore[attr-defined] 172s 172s log.debug( 172s '%s://%s:%s "%s %s %s" %s %s', 172s self.scheme, 172s self.host, 172s self.port, 172s method, 172s url, 172s > response.version_string, 172s response.status, 172s response.length_remaining, 172s ) 172s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 172s 172s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 172s ----------------------------- Captured stderr call ----------------------------- 172s 127.0.0.1 - - [18/Jan/2025 03:25:42] "GET /bytes/1024 HTTP/1.1" 200 1024 172s _______________________________ test_auth[http] ________________________________ 172s 172s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-1/test_auth_http_0') 172s httpbin_both = 172s verify_pool_mgr = 172s 172s def test_auth(tmpdir, httpbin_both, verify_pool_mgr): 172s """Ensure that we can handle basic auth""" 172s auth = ("user", "passwd") 172s headers = urllib3.util.make_headers(basic_auth="{}:{}".format(*auth)) 172s url = httpbin_both.url + "/basic-auth/user/passwd" 172s with vcr.use_cassette(str(tmpdir.join("auth.yaml"))): 172s > one = verify_pool_mgr.request("GET", url, headers=headers) 172s 172s tests/integration/test_urllib3.py:67: 172s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 172s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 172s return self.request_encode_url( 172s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 172s return self.urlopen(method, url, **extra_kw) 172s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 172s response = conn.urlopen(method, u.request_uri, **kw) 172s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 172s response = self._make_request( 172s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 172s 172s self = 172s conn = 172s method = 'GET', url = '/basic-auth/user/passwd', body = None 172s headers = {'authorization': 'Basic dXNlcjpwYXNzd2Q='} 172s retries = Retry(total=3, 
connect=None, read=None, redirect=None, status=None) 172s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 172s chunked = False, response_conn = None, preload_content = True 172s decode_content = True, enforce_content_length = True 172s 172s def _make_request( 172s self, 172s conn: BaseHTTPConnection, 172s method: str, 172s url: str, 172s body: _TYPE_BODY | None = None, 172s headers: typing.Mapping[str, str] | None = None, 172s retries: Retry | None = None, 172s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 172s chunked: bool = False, 172s response_conn: BaseHTTPConnection | None = None, 172s preload_content: bool = True, 172s decode_content: bool = True, 172s enforce_content_length: bool = True, 172s ) -> BaseHTTPResponse: 172s """ 172s Perform a request on a given urllib connection object taken from our 172s pool. 172s 172s :param conn: 172s a connection from one of our connection pools 172s 172s :param method: 172s HTTP request method (such as GET, POST, PUT, etc.) 172s 172s :param url: 172s The URL to perform the request on. 172s 172s :param body: 172s Data to send in the request body, either :class:`str`, :class:`bytes`, 172s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 172s 172s :param headers: 172s Dictionary of custom headers to send, such as User-Agent, 172s If-None-Match, etc. If None, pool headers are used. If provided, 172s these headers completely replace any pool-specific headers. 172s 172s :param retries: 172s Configure the number of retries to allow before raising a 172s :class:`~urllib3.exceptions.MaxRetryError` exception. 172s 172s Pass ``None`` to retry until you receive a response. Pass a 172s :class:`~urllib3.util.retry.Retry` object for fine-grained control 172s over different types of retries. 172s Pass an integer number to retry connection errors that many times, 172s but no other types of errors. Pass zero to never retry. 172s 172s If ``False``, then retries are disabled and any exception is raised 172s immediately. Also, instead of raising a MaxRetryError on redirects, 172s the redirect response will be returned. 172s 172s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 172s 172s :param timeout: 172s If specified, overrides the default timeout for this one 172s request. It may be a float (in seconds) or an instance of 172s :class:`urllib3.util.Timeout`. 172s 172s :param chunked: 172s If True, urllib3 will send the body using chunked transfer 172s encoding. Otherwise, urllib3 will send the body using the standard 172s content-length form. Defaults to False. 172s 172s :param response_conn: 172s Set this to ``None`` if you will handle releasing the connection or 172s set the connection to have the response release it. 172s 172s :param preload_content: 172s If True, the response's body will be preloaded during construction. 172s 172s :param decode_content: 172s If True, will attempt to decode the body based on the 172s 'content-encoding' header. 172s 172s :param enforce_content_length: 172s Enforce content length checking. Body returned by server must match 172s value of Content-Length header, if present. Otherwise, raise error. 172s """ 172s self.num_requests += 1 172s 172s timeout_obj = self._get_timeout(timeout) 172s timeout_obj.start_connect() 172s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 172s 172s try: 172s # Trigger any extra validation we need to do. 
172s try: 172s self._validate_conn(conn) 172s except (SocketTimeout, BaseSSLError) as e: 172s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 172s raise 172s 172s # _validate_conn() starts the connection to an HTTPS proxy 172s # so we need to wrap errors with 'ProxyError' here too. 172s except ( 172s OSError, 172s NewConnectionError, 172s TimeoutError, 172s BaseSSLError, 172s CertificateError, 172s SSLError, 172s ) as e: 172s new_e: Exception = e 172s if isinstance(e, (BaseSSLError, CertificateError)): 172s new_e = SSLError(e) 172s # If the connection didn't successfully connect to it's proxy 172s # then there 172s if isinstance( 172s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 172s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 172s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 172s raise new_e 172s 172s # conn.request() calls http.client.*.request, not the method in 172s # urllib3.request. It also calls makefile (recv) on the socket. 172s try: 172s conn.request( 172s method, 172s url, 172s body=body, 172s headers=headers, 172s chunked=chunked, 172s preload_content=preload_content, 172s decode_content=decode_content, 172s enforce_content_length=enforce_content_length, 172s ) 172s 172s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 172s # legitimately able to close the connection after sending a valid response. 172s # With this behaviour, the received response is still readable. 172s except BrokenPipeError: 172s pass 172s except OSError as e: 172s # MacOS/Linux 172s # EPROTOTYPE and ECONNRESET are needed on macOS 172s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 172s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 172s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 172s raise 172s 172s # Reset the timeout for the recv() on the socket 172s read_timeout = timeout_obj.read_timeout 172s 172s if not conn.is_closed: 172s # In Python 3 socket.py will catch EAGAIN and return None when you 172s # try and read into the file pointer created by http.client, which 172s # instead raises a BadStatusLine exception. Instead of catching 172s # the exception and assuming all BadStatusLine exceptions are read 172s # timeouts, check for a zero timeout before making the request. 172s if read_timeout == 0: 172s raise ReadTimeoutError( 172s self, url, f"Read timed out. (read timeout={read_timeout})" 172s ) 172s conn.timeout = read_timeout 172s 172s # Receive the response from the server 172s try: 172s response = conn.getresponse() 172s except (BaseSSLError, OSError) as e: 172s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 172s raise 172s 172s # Set properties that are used by the pooling layer. 
172s response.retries = retries 172s response._connection = response_conn # type: ignore[attr-defined] 172s response._pool = self # type: ignore[attr-defined] 172s 172s log.debug( 172s '%s://%s:%s "%s %s %s" %s %s', 172s self.scheme, 172s self.host, 172s self.port, 172s method, 172s url, 172s > response.version_string, 172s response.status, 172s response.length_remaining, 172s ) 172s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 172s 172s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 172s ----------------------------- Captured stderr call ----------------------------- 172s 127.0.0.1 - - [18/Jan/2025 03:25:42] "GET /basic-auth/user/passwd HTTP/1.1" 200 46 172s ____________________________ test_auth_failed[http] ____________________________ 172s 172s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-1/test_auth_failed_http_0') 172s httpbin_both = 172s verify_pool_mgr = 172s 172s def test_auth_failed(tmpdir, httpbin_both, verify_pool_mgr): 172s """Ensure that we can save failed auth statuses""" 172s auth = ("user", "wrongwrongwrong") 172s headers = urllib3.util.make_headers(basic_auth="{}:{}".format(*auth)) 172s url = httpbin_both.url + "/basic-auth/user/passwd" 172s with vcr.use_cassette(str(tmpdir.join("auth-failed.yaml"))) as cass: 172s # Ensure that this is empty to begin with 172s assert_cassette_empty(cass) 172s > one = verify_pool_mgr.request("GET", url, headers=headers) 172s 172s tests/integration/test_urllib3.py:83: 172s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 172s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 172s return self.request_encode_url( 172s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 172s return self.urlopen(method, url, **extra_kw) 172s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 172s response = conn.urlopen(method, u.request_uri, **kw) 172s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 172s response = self._make_request( 172s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 172s 172s self = 172s conn = 172s method = 'GET', url = '/basic-auth/user/passwd', body = None 172s headers = {'authorization': 'Basic dXNlcjp3cm9uZ3dyb25nd3Jvbmc='} 172s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 172s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 172s chunked = False, response_conn = None, preload_content = True 172s decode_content = True, enforce_content_length = True 172s 172s def _make_request( 172s self, 172s conn: BaseHTTPConnection, 172s method: str, 172s url: str, 172s body: _TYPE_BODY | None = None, 172s headers: typing.Mapping[str, str] | None = None, 172s retries: Retry | None = None, 172s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 172s chunked: bool = False, 172s response_conn: BaseHTTPConnection | None = None, 172s preload_content: bool = True, 172s decode_content: bool = True, 172s enforce_content_length: bool = True, 172s ) -> BaseHTTPResponse: 172s """ 172s Perform a request on a given urllib connection object taken from our 172s pool. 172s 172s :param conn: 172s a connection from one of our connection pools 172s 172s :param method: 172s HTTP request method (such as GET, POST, PUT, etc.) 172s 172s :param url: 172s The URL to perform the request on. 
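A minimal sketch, not part of the captured output, of the basic-auth header construction used by test_auth and test_auth_failed above; the resulting value matches the authorization header visible in the captured locals.

import urllib3

headers = urllib3.util.make_headers(basic_auth="user:passwd")
print(headers)   # {'authorization': 'Basic dXNlcjpwYXNzd2Q='}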
172s 172s :param body: 172s Data to send in the request body, either :class:`str`, :class:`bytes`, 172s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 172s 172s :param headers: 172s Dictionary of custom headers to send, such as User-Agent, 172s If-None-Match, etc. If None, pool headers are used. If provided, 172s these headers completely replace any pool-specific headers. 172s 172s :param retries: 172s Configure the number of retries to allow before raising a 172s :class:`~urllib3.exceptions.MaxRetryError` exception. 172s 172s Pass ``None`` to retry until you receive a response. Pass a 172s :class:`~urllib3.util.retry.Retry` object for fine-grained control 172s over different types of retries. 172s Pass an integer number to retry connection errors that many times, 172s but no other types of errors. Pass zero to never retry. 172s 172s If ``False``, then retries are disabled and any exception is raised 172s immediately. Also, instead of raising a MaxRetryError on redirects, 172s the redirect response will be returned. 172s 172s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 172s 172s :param timeout: 172s If specified, overrides the default timeout for this one 172s request. It may be a float (in seconds) or an instance of 172s :class:`urllib3.util.Timeout`. 172s 172s :param chunked: 172s If True, urllib3 will send the body using chunked transfer 172s encoding. Otherwise, urllib3 will send the body using the standard 172s content-length form. Defaults to False. 172s 172s :param response_conn: 172s Set this to ``None`` if you will handle releasing the connection or 172s set the connection to have the response release it. 172s 172s :param preload_content: 172s If True, the response's body will be preloaded during construction. 172s 172s :param decode_content: 172s If True, will attempt to decode the body based on the 172s 'content-encoding' header. 172s 172s :param enforce_content_length: 172s Enforce content length checking. Body returned by server must match 172s value of Content-Length header, if present. Otherwise, raise error. 172s """ 172s self.num_requests += 1 172s 172s timeout_obj = self._get_timeout(timeout) 172s timeout_obj.start_connect() 172s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 172s 172s try: 172s # Trigger any extra validation we need to do. 172s try: 172s self._validate_conn(conn) 172s except (SocketTimeout, BaseSSLError) as e: 172s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 172s raise 172s 172s # _validate_conn() starts the connection to an HTTPS proxy 172s # so we need to wrap errors with 'ProxyError' here too. 172s except ( 172s OSError, 172s NewConnectionError, 172s TimeoutError, 172s BaseSSLError, 172s CertificateError, 172s SSLError, 172s ) as e: 172s new_e: Exception = e 172s if isinstance(e, (BaseSSLError, CertificateError)): 172s new_e = SSLError(e) 172s # If the connection didn't successfully connect to it's proxy 172s # then there 172s if isinstance( 172s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 172s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 172s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 172s raise new_e 172s 172s # conn.request() calls http.client.*.request, not the method in 172s # urllib3.request. It also calls makefile (recv) on the socket. 
172s try: 172s conn.request( 172s method, 172s url, 172s body=body, 172s headers=headers, 172s chunked=chunked, 172s preload_content=preload_content, 172s decode_content=decode_content, 172s enforce_content_length=enforce_content_length, 172s ) 172s 172s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 172s # legitimately able to close the connection after sending a valid response. 172s # With this behaviour, the received response is still readable. 172s except BrokenPipeError: 172s pass 172s except OSError as e: 172s # MacOS/Linux 172s # EPROTOTYPE and ECONNRESET are needed on macOS 172s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 172s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 172s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 172s raise 172s 172s # Reset the timeout for the recv() on the socket 172s read_timeout = timeout_obj.read_timeout 172s 172s if not conn.is_closed: 172s # In Python 3 socket.py will catch EAGAIN and return None when you 172s # try and read into the file pointer created by http.client, which 172s # instead raises a BadStatusLine exception. Instead of catching 172s # the exception and assuming all BadStatusLine exceptions are read 172s # timeouts, check for a zero timeout before making the request. 172s if read_timeout == 0: 172s raise ReadTimeoutError( 172s self, url, f"Read timed out. (read timeout={read_timeout})" 172s ) 172s conn.timeout = read_timeout 172s 172s # Receive the response from the server 172s try: 172s response = conn.getresponse() 172s except (BaseSSLError, OSError) as e: 172s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 172s raise 172s 172s # Set properties that are used by the pooling layer. 
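A minimal sketch, not part of the captured output, of the pattern these integration tests follow: issue a urllib3 request while recording into a VCR cassette. With vcr.py 6.0.2 under urllib3 2.3.0 this is the call path that reaches the failing response.version_string access shown below; the cassette path and URL are placeholders.

import urllib3
import vcr

http = urllib3.PoolManager()
with vcr.use_cassette("/tmp/example-cassette.yaml"):
    resp = http.request("GET", "https://example.com/")
    print(resp.status)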
172s response.retries = retries 172s response._connection = response_conn # type: ignore[attr-defined] 172s response._pool = self # type: ignore[attr-defined] 172s 172s log.debug( 172s '%s://%s:%s "%s %s %s" %s %s', 172s self.scheme, 172s self.host, 172s self.port, 172s method, 172s url, 172s > response.version_string, 172s response.status, 172s response.length_remaining, 172s ) 172s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 172s 172s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 172s ----------------------------- Captured stderr call ----------------------------- 172s 127.0.0.1 - - [18/Jan/2025 03:25:42] "GET /basic-auth/user/passwd HTTP/1.1" 401 0 172s _______________________________ test_post[http] ________________________________ 172s 172s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-1/test_post_http_0') 172s httpbin_both = 172s verify_pool_mgr = 172s 172s def test_post(tmpdir, httpbin_both, verify_pool_mgr): 172s """Ensure that we can post and cache the results""" 172s data = {"key1": "value1", "key2": "value2"} 172s url = httpbin_both.url + "/post" 172s with vcr.use_cassette(str(tmpdir.join("verify_pool_mgr.yaml"))): 172s > req1 = verify_pool_mgr.request("POST", url, data).data 172s 172s tests/integration/test_urllib3.py:94: 172s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 172s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:143: in request 172s return self.request_encode_body( 172s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:278: in request_encode_body 172s return self.urlopen(method, url, **extra_kw) 172s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 172s response = conn.urlopen(method, u.request_uri, **kw) 172s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 172s response = self._make_request( 172s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 172s 172s self = 172s conn = 172s method = 'POST', url = '/post', body = {'key1': 'value1', 'key2': 'value2'} 172s headers = HTTPHeaderDict({}) 172s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 172s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 172s chunked = False, response_conn = None, preload_content = True 172s decode_content = True, enforce_content_length = True 172s 172s def _make_request( 172s self, 172s conn: BaseHTTPConnection, 172s method: str, 172s url: str, 172s body: _TYPE_BODY | None = None, 172s headers: typing.Mapping[str, str] | None = None, 172s retries: Retry | None = None, 172s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 172s chunked: bool = False, 172s response_conn: BaseHTTPConnection | None = None, 172s preload_content: bool = True, 172s decode_content: bool = True, 172s enforce_content_length: bool = True, 172s ) -> BaseHTTPResponse: 172s """ 172s Perform a request on a given urllib connection object taken from our 172s pool. 172s 172s :param conn: 172s a connection from one of our connection pools 172s 172s :param method: 172s HTTP request method (such as GET, POST, PUT, etc.) 172s 172s :param url: 172s The URL to perform the request on. 172s 172s :param body: 172s Data to send in the request body, either :class:`str`, :class:`bytes`, 172s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 172s 172s :param headers: 172s Dictionary of custom headers to send, such as User-Agent, 172s If-None-Match, etc. 
If None, pool headers are used. If provided, 172s these headers completely replace any pool-specific headers. 172s 172s :param retries: 172s Configure the number of retries to allow before raising a 172s :class:`~urllib3.exceptions.MaxRetryError` exception. 172s 172s Pass ``None`` to retry until you receive a response. Pass a 172s :class:`~urllib3.util.retry.Retry` object for fine-grained control 172s over different types of retries. 172s Pass an integer number to retry connection errors that many times, 172s but no other types of errors. Pass zero to never retry. 172s 172s If ``False``, then retries are disabled and any exception is raised 172s immediately. Also, instead of raising a MaxRetryError on redirects, 172s the redirect response will be returned. 172s 172s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 172s 172s :param timeout: 172s If specified, overrides the default timeout for this one 172s request. It may be a float (in seconds) or an instance of 172s :class:`urllib3.util.Timeout`. 172s 172s :param chunked: 172s If True, urllib3 will send the body using chunked transfer 172s encoding. Otherwise, urllib3 will send the body using the standard 172s content-length form. Defaults to False. 172s 172s :param response_conn: 172s Set this to ``None`` if you will handle releasing the connection or 172s set the connection to have the response release it. 172s 172s :param preload_content: 172s If True, the response's body will be preloaded during construction. 172s 172s :param decode_content: 172s If True, will attempt to decode the body based on the 172s 'content-encoding' header. 172s 172s :param enforce_content_length: 172s Enforce content length checking. Body returned by server must match 172s value of Content-Length header, if present. Otherwise, raise error. 172s """ 172s self.num_requests += 1 172s 172s timeout_obj = self._get_timeout(timeout) 172s timeout_obj.start_connect() 172s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 172s 172s try: 172s # Trigger any extra validation we need to do. 172s try: 172s self._validate_conn(conn) 172s except (SocketTimeout, BaseSSLError) as e: 172s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 172s raise 172s 172s # _validate_conn() starts the connection to an HTTPS proxy 172s # so we need to wrap errors with 'ProxyError' here too. 172s except ( 172s OSError, 172s NewConnectionError, 172s TimeoutError, 172s BaseSSLError, 172s CertificateError, 172s SSLError, 172s ) as e: 172s new_e: Exception = e 172s if isinstance(e, (BaseSSLError, CertificateError)): 172s new_e = SSLError(e) 172s # If the connection didn't successfully connect to it's proxy 172s # then there 172s if isinstance( 172s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 172s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 172s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 172s raise new_e 172s 172s # conn.request() calls http.client.*.request, not the method in 172s # urllib3.request. It also calls makefile (recv) on the socket. 172s try: 172s conn.request( 172s method, 172s url, 172s body=body, 172s headers=headers, 172s chunked=chunked, 172s preload_content=preload_content, 172s decode_content=decode_content, 172s enforce_content_length=enforce_content_length, 172s ) 172s 172s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 172s # legitimately able to close the connection after sending a valid response. 
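A minimal sketch, not part of the captured output, of the preload_content option described in the docstring above: with preload_content=False the body is streamed on demand instead of being buffered during construction. The URL is a placeholder.

import urllib3

http = urllib3.PoolManager()
resp = http.request("GET", "https://example.com/", preload_content=False)
first_chunk = resp.read(1024)   # read lazily instead of loading the whole body up front
resp.release_conn()             # return the socket to the pool when done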
172s # With this behaviour, the received response is still readable. 172s except BrokenPipeError: 172s pass 172s except OSError as e: 172s # MacOS/Linux 172s # EPROTOTYPE and ECONNRESET are needed on macOS 172s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 172s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 172s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 172s raise 172s 172s # Reset the timeout for the recv() on the socket 172s read_timeout = timeout_obj.read_timeout 172s 172s if not conn.is_closed: 172s # In Python 3 socket.py will catch EAGAIN and return None when you 172s # try and read into the file pointer created by http.client, which 172s # instead raises a BadStatusLine exception. Instead of catching 172s # the exception and assuming all BadStatusLine exceptions are read 172s # timeouts, check for a zero timeout before making the request. 172s if read_timeout == 0: 172s raise ReadTimeoutError( 172s self, url, f"Read timed out. (read timeout={read_timeout})" 172s ) 172s conn.timeout = read_timeout 172s 172s # Receive the response from the server 172s try: 172s response = conn.getresponse() 172s except (BaseSSLError, OSError) as e: 172s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 172s raise 172s 172s # Set properties that are used by the pooling layer. 172s response.retries = retries 172s response._connection = response_conn # type: ignore[attr-defined] 172s response._pool = self # type: ignore[attr-defined] 172s 172s log.debug( 172s '%s://%s:%s "%s %s %s" %s %s', 172s self.scheme, 172s self.host, 172s self.port, 172s method, 172s url, 172s > response.version_string, 172s response.status, 172s response.length_remaining, 172s ) 172s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 172s 172s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 172s ----------------------------- Captured stderr call ----------------------------- 172s 127.0.0.1 - - [18/Jan/2025 03:25:42] "POST /post HTTP/1.1" 501 159 172s _______________________________ test_gzip[http] ________________________________ 172s 172s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-1/test_gzip_http_0') 172s httpbin_both = 172s verify_pool_mgr = 172s 172s def test_gzip(tmpdir, httpbin_both, verify_pool_mgr): 172s """ 172s Ensure that requests (actually urllib3) is able to automatically decompress 172s the response body 172s """ 172s url = httpbin_both.url + "/gzip" 172s response = verify_pool_mgr.request("GET", url) 172s 172s with vcr.use_cassette(str(tmpdir.join("gzip.yaml"))): 172s > response = verify_pool_mgr.request("GET", url) 172s 172s tests/integration/test_urllib3.py:140: 172s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 172s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 172s return self.request_encode_url( 172s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 172s return self.urlopen(method, url, **extra_kw) 172s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 172s response = conn.urlopen(method, u.request_uri, **kw) 172s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 172s response = self._make_request( 172s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 172s 172s self = 172s conn = 172s method = 'GET', url = '/gzip', body = None, headers = {} 172s retries = Retry(total=3, 
connect=None, read=None, redirect=None, status=None) 172s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 172s chunked = False, response_conn = None, preload_content = True 172s decode_content = True, enforce_content_length = True 172s 172s def _make_request( 172s self, 172s conn: BaseHTTPConnection, 172s method: str, 172s url: str, 172s body: _TYPE_BODY | None = None, 172s headers: typing.Mapping[str, str] | None = None, 172s retries: Retry | None = None, 172s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 172s chunked: bool = False, 172s response_conn: BaseHTTPConnection | None = None, 172s preload_content: bool = True, 172s decode_content: bool = True, 172s enforce_content_length: bool = True, 172s ) -> BaseHTTPResponse: 172s """ 172s Perform a request on a given urllib connection object taken from our 172s pool. 172s 172s :param conn: 172s a connection from one of our connection pools 172s 172s :param method: 172s HTTP request method (such as GET, POST, PUT, etc.) 172s 172s :param url: 172s The URL to perform the request on. 172s 172s :param body: 172s Data to send in the request body, either :class:`str`, :class:`bytes`, 172s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 172s 172s :param headers: 172s Dictionary of custom headers to send, such as User-Agent, 172s If-None-Match, etc. If None, pool headers are used. If provided, 172s these headers completely replace any pool-specific headers. 172s 172s :param retries: 172s Configure the number of retries to allow before raising a 172s :class:`~urllib3.exceptions.MaxRetryError` exception. 172s 172s Pass ``None`` to retry until you receive a response. Pass a 172s :class:`~urllib3.util.retry.Retry` object for fine-grained control 172s over different types of retries. 172s Pass an integer number to retry connection errors that many times, 172s but no other types of errors. Pass zero to never retry. 172s 172s If ``False``, then retries are disabled and any exception is raised 172s immediately. Also, instead of raising a MaxRetryError on redirects, 172s the redirect response will be returned. 172s 172s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 172s 172s :param timeout: 172s If specified, overrides the default timeout for this one 172s request. It may be a float (in seconds) or an instance of 172s :class:`urllib3.util.Timeout`. 172s 172s :param chunked: 172s If True, urllib3 will send the body using chunked transfer 172s encoding. Otherwise, urllib3 will send the body using the standard 172s content-length form. Defaults to False. 172s 172s :param response_conn: 172s Set this to ``None`` if you will handle releasing the connection or 172s set the connection to have the response release it. 172s 172s :param preload_content: 172s If True, the response's body will be preloaded during construction. 172s 172s :param decode_content: 172s If True, will attempt to decode the body based on the 172s 'content-encoding' header. 172s 172s :param enforce_content_length: 172s Enforce content length checking. Body returned by server must match 172s value of Content-Length header, if present. Otherwise, raise error. 172s """ 172s self.num_requests += 1 172s 172s timeout_obj = self._get_timeout(timeout) 172s timeout_obj.start_connect() 172s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 172s 172s try: 172s # Trigger any extra validation we need to do. 
172s try: 172s self._validate_conn(conn) 172s except (SocketTimeout, BaseSSLError) as e: 172s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 172s raise 172s 172s # _validate_conn() starts the connection to an HTTPS proxy 172s # so we need to wrap errors with 'ProxyError' here too. 172s except ( 172s OSError, 172s NewConnectionError, 172s TimeoutError, 172s BaseSSLError, 172s CertificateError, 172s SSLError, 172s ) as e: 172s new_e: Exception = e 172s if isinstance(e, (BaseSSLError, CertificateError)): 172s new_e = SSLError(e) 172s # If the connection didn't successfully connect to it's proxy 172s # then there 172s if isinstance( 172s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 172s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 172s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 172s raise new_e 172s 172s # conn.request() calls http.client.*.request, not the method in 172s # urllib3.request. It also calls makefile (recv) on the socket. 172s try: 172s conn.request( 172s method, 172s url, 172s body=body, 172s headers=headers, 172s chunked=chunked, 172s preload_content=preload_content, 172s decode_content=decode_content, 172s enforce_content_length=enforce_content_length, 172s ) 172s 172s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 172s # legitimately able to close the connection after sending a valid response. 172s # With this behaviour, the received response is still readable. 172s except BrokenPipeError: 172s pass 172s except OSError as e: 172s # MacOS/Linux 172s # EPROTOTYPE and ECONNRESET are needed on macOS 172s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 172s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 172s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 172s raise 172s 172s # Reset the timeout for the recv() on the socket 172s read_timeout = timeout_obj.read_timeout 172s 172s if not conn.is_closed: 172s # In Python 3 socket.py will catch EAGAIN and return None when you 172s # try and read into the file pointer created by http.client, which 172s # instead raises a BadStatusLine exception. Instead of catching 172s # the exception and assuming all BadStatusLine exceptions are read 172s # timeouts, check for a zero timeout before making the request. 172s if read_timeout == 0: 172s raise ReadTimeoutError( 172s self, url, f"Read timed out. (read timeout={read_timeout})" 172s ) 172s conn.timeout = read_timeout 172s 172s # Receive the response from the server 172s try: 172s response = conn.getresponse() 172s except (BaseSSLError, OSError) as e: 172s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 172s raise 172s 172s # Set properties that are used by the pooling layer. 
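A minimal sketch, not part of the captured output, of the transparent gzip handling that test_gzip above relies on: with the default decode_content=True, urllib3 decompresses the body based on the Content-Encoding header. It assumes the public httpbin.org /gzip endpoint is reachable.

import urllib3

http = urllib3.PoolManager()
resp = http.request("GET", "https://httpbin.org/gzip")
print(resp.headers.get("Content-Encoding"))   # gzip
print(resp.data[:1])                          # b'{' -- the JSON body arrives already decoded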
172s response.retries = retries 172s response._connection = response_conn # type: ignore[attr-defined] 172s response._pool = self # type: ignore[attr-defined] 172s 172s log.debug( 172s '%s://%s:%s "%s %s %s" %s %s', 172s self.scheme, 172s self.host, 172s self.port, 172s method, 172s url, 172s > response.version_string, 172s response.status, 172s response.length_remaining, 172s ) 172s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 172s 172s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 172s ----------------------------- Captured stderr call ----------------------------- 172s 127.0.0.1 - - [18/Jan/2025 03:25:43] "GET /gzip HTTP/1.1" 200 165 172s 127.0.0.1 - - [18/Jan/2025 03:25:43] "GET /gzip HTTP/1.1" 200 165 172s ___________________________ test_status_code[https] ____________________________ 172s 172s httpbin_both = 172s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-1/test_status_code_https_0') 172s verify_pool_mgr = 172s 172s def test_status_code(httpbin_both, tmpdir, verify_pool_mgr): 172s """Ensure that we can read the status code""" 172s url = httpbin_both.url 172s with vcr.use_cassette(str(tmpdir.join("atts.yaml"))): 172s > status_code = verify_pool_mgr.request("GET", url).status 172s 172s tests/integration/test_urllib3.py:34: 172s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 172s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 172s return self.request_encode_url( 172s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 172s return self.urlopen(method, url, **extra_kw) 172s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 172s response = conn.urlopen(method, u.request_uri, **kw) 172s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 172s response = self._make_request( 172s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 172s 172s self = 172s conn = 172s method = 'GET', url = '/', body = None, headers = {} 172s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 172s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 172s chunked = False, response_conn = None, preload_content = True 172s decode_content = True, enforce_content_length = True 172s 172s def _make_request( 172s self, 172s conn: BaseHTTPConnection, 172s method: str, 172s url: str, 172s body: _TYPE_BODY | None = None, 172s headers: typing.Mapping[str, str] | None = None, 172s retries: Retry | None = None, 172s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 172s chunked: bool = False, 172s response_conn: BaseHTTPConnection | None = None, 172s preload_content: bool = True, 172s decode_content: bool = True, 172s enforce_content_length: bool = True, 172s ) -> BaseHTTPResponse: 172s """ 172s Perform a request on a given urllib connection object taken from our 172s pool. 172s 172s :param conn: 172s a connection from one of our connection pools 172s 172s :param method: 172s HTTP request method (such as GET, POST, PUT, etc.) 172s 172s :param url: 172s The URL to perform the request on. 172s 172s :param body: 172s Data to send in the request body, either :class:`str`, :class:`bytes`, 172s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 172s 172s :param headers: 172s Dictionary of custom headers to send, such as User-Agent, 172s If-None-Match, etc. If None, pool headers are used. 
If provided, 172s these headers completely replace any pool-specific headers. 172s 172s :param retries: 172s Configure the number of retries to allow before raising a 172s :class:`~urllib3.exceptions.MaxRetryError` exception. 172s 172s Pass ``None`` to retry until you receive a response. Pass a 172s :class:`~urllib3.util.retry.Retry` object for fine-grained control 172s over different types of retries. 172s Pass an integer number to retry connection errors that many times, 172s but no other types of errors. Pass zero to never retry. 172s 172s If ``False``, then retries are disabled and any exception is raised 172s immediately. Also, instead of raising a MaxRetryError on redirects, 172s the redirect response will be returned. 172s 172s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 172s 172s :param timeout: 172s If specified, overrides the default timeout for this one 172s request. It may be a float (in seconds) or an instance of 172s :class:`urllib3.util.Timeout`. 172s 172s :param chunked: 172s If True, urllib3 will send the body using chunked transfer 172s encoding. Otherwise, urllib3 will send the body using the standard 172s content-length form. Defaults to False. 172s 172s :param response_conn: 172s Set this to ``None`` if you will handle releasing the connection or 172s set the connection to have the response release it. 172s 172s :param preload_content: 172s If True, the response's body will be preloaded during construction. 172s 172s :param decode_content: 172s If True, will attempt to decode the body based on the 172s 'content-encoding' header. 172s 172s :param enforce_content_length: 172s Enforce content length checking. Body returned by server must match 172s value of Content-Length header, if present. Otherwise, raise error. 172s """ 172s self.num_requests += 1 172s 172s timeout_obj = self._get_timeout(timeout) 172s timeout_obj.start_connect() 172s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 172s 172s try: 172s # Trigger any extra validation we need to do. 172s try: 172s self._validate_conn(conn) 172s except (SocketTimeout, BaseSSLError) as e: 172s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 172s raise 172s 172s # _validate_conn() starts the connection to an HTTPS proxy 172s # so we need to wrap errors with 'ProxyError' here too. 172s except ( 172s OSError, 172s NewConnectionError, 172s TimeoutError, 172s BaseSSLError, 172s CertificateError, 172s SSLError, 172s ) as e: 172s new_e: Exception = e 172s if isinstance(e, (BaseSSLError, CertificateError)): 172s new_e = SSLError(e) 172s # If the connection didn't successfully connect to it's proxy 172s # then there 172s if isinstance( 172s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 172s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 172s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 172s raise new_e 172s 172s # conn.request() calls http.client.*.request, not the method in 172s # urllib3.request. It also calls makefile (recv) on the socket. 172s try: 172s conn.request( 172s method, 172s url, 172s body=body, 172s headers=headers, 172s chunked=chunked, 172s preload_content=preload_content, 172s decode_content=decode_content, 172s enforce_content_length=enforce_content_length, 172s ) 172s 172s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 172s # legitimately able to close the connection after sending a valid response. 172s # With this behaviour, the received response is still readable. 
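A minimal sketch, not part of the captured output, of how a read timeout surfaces from the code path above; depending on the retry configuration either ReadTimeoutError or MaxRetryError reaches the caller. It assumes the public httpbin.org /delay endpoint is reachable.

import urllib3
from urllib3.exceptions import MaxRetryError, ReadTimeoutError
from urllib3.util.timeout import Timeout

http = urllib3.PoolManager()
try:
    # /delay/3 holds the response for about three seconds, so a 1 s read timeout trips first.
    http.request("GET", "https://httpbin.org/delay/3",
                 timeout=Timeout(connect=5.0, read=1.0), retries=False)
except (ReadTimeoutError, MaxRetryError) as exc:
    print("request timed out:", exc)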
172s except BrokenPipeError: 172s pass 172s except OSError as e: 172s # MacOS/Linux 172s # EPROTOTYPE and ECONNRESET are needed on macOS 172s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 172s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 172s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 172s raise 172s 172s # Reset the timeout for the recv() on the socket 172s read_timeout = timeout_obj.read_timeout 172s 172s if not conn.is_closed: 172s # In Python 3 socket.py will catch EAGAIN and return None when you 172s # try and read into the file pointer created by http.client, which 172s # instead raises a BadStatusLine exception. Instead of catching 172s # the exception and assuming all BadStatusLine exceptions are read 172s # timeouts, check for a zero timeout before making the request. 172s if read_timeout == 0: 172s raise ReadTimeoutError( 172s self, url, f"Read timed out. (read timeout={read_timeout})" 172s ) 172s conn.timeout = read_timeout 172s 172s # Receive the response from the server 172s try: 172s response = conn.getresponse() 172s except (BaseSSLError, OSError) as e: 172s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 172s raise 172s 172s # Set properties that are used by the pooling layer. 172s response.retries = retries 172s response._connection = response_conn # type: ignore[attr-defined] 172s response._pool = self # type: ignore[attr-defined] 172s 172s log.debug( 172s '%s://%s:%s "%s %s %s" %s %s', 172s self.scheme, 172s self.host, 172s self.port, 172s method, 172s url, 172s > response.version_string, 172s response.status, 172s response.length_remaining, 172s ) 172s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 172s 172s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 172s ----------------------------- Captured stderr call ----------------------------- 172s 127.0.0.1 - - [18/Jan/2025 03:25:43] "GET / HTTP/1.1" 200 9358 172s _____________________________ test_headers[https] ______________________________ 172s 172s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-1/test_headers_https_0') 172s httpbin_both = 172s verify_pool_mgr = 172s 172s def test_headers(tmpdir, httpbin_both, verify_pool_mgr): 172s """Ensure that we can read the headers back""" 172s url = httpbin_both.url 172s with vcr.use_cassette(str(tmpdir.join("headers.yaml"))): 172s > headers = verify_pool_mgr.request("GET", url).headers 172s 172s tests/integration/test_urllib3.py:44: 172s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 172s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 172s return self.request_encode_url( 172s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 172s return self.urlopen(method, url, **extra_kw) 172s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 172s response = conn.urlopen(method, u.request_uri, **kw) 172s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 172s response = self._make_request( 172s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 172s 172s self = 172s conn = 172s method = 'GET', url = '/', body = None, headers = {} 172s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 172s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 172s chunked = False, response_conn = None, 
preload_content = True 172s decode_content = True, enforce_content_length = True 172s 172s def _make_request( 172s self, 172s conn: BaseHTTPConnection, 172s method: str, 172s url: str, 172s body: _TYPE_BODY | None = None, 172s headers: typing.Mapping[str, str] | None = None, 172s retries: Retry | None = None, 172s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 172s chunked: bool = False, 172s response_conn: BaseHTTPConnection | None = None, 172s preload_content: bool = True, 172s decode_content: bool = True, 172s enforce_content_length: bool = True, 172s ) -> BaseHTTPResponse: 172s """ 172s Perform a request on a given urllib connection object taken from our 172s pool. 172s 172s :param conn: 172s a connection from one of our connection pools 172s 172s :param method: 172s HTTP request method (such as GET, POST, PUT, etc.) 172s 172s :param url: 172s The URL to perform the request on. 172s 172s :param body: 172s Data to send in the request body, either :class:`str`, :class:`bytes`, 172s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 172s 172s :param headers: 172s Dictionary of custom headers to send, such as User-Agent, 172s If-None-Match, etc. If None, pool headers are used. If provided, 172s these headers completely replace any pool-specific headers. 172s 172s :param retries: 172s Configure the number of retries to allow before raising a 172s :class:`~urllib3.exceptions.MaxRetryError` exception. 172s 172s Pass ``None`` to retry until you receive a response. Pass a 172s :class:`~urllib3.util.retry.Retry` object for fine-grained control 172s over different types of retries. 172s Pass an integer number to retry connection errors that many times, 172s but no other types of errors. Pass zero to never retry. 172s 172s If ``False``, then retries are disabled and any exception is raised 172s immediately. Also, instead of raising a MaxRetryError on redirects, 172s the redirect response will be returned. 172s 172s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 172s 172s :param timeout: 172s If specified, overrides the default timeout for this one 172s request. It may be a float (in seconds) or an instance of 172s :class:`urllib3.util.Timeout`. 172s 172s :param chunked: 172s If True, urllib3 will send the body using chunked transfer 172s encoding. Otherwise, urllib3 will send the body using the standard 172s content-length form. Defaults to False. 172s 172s :param response_conn: 172s Set this to ``None`` if you will handle releasing the connection or 172s set the connection to have the response release it. 172s 172s :param preload_content: 172s If True, the response's body will be preloaded during construction. 172s 172s :param decode_content: 172s If True, will attempt to decode the body based on the 172s 'content-encoding' header. 172s 172s :param enforce_content_length: 172s Enforce content length checking. Body returned by server must match 172s value of Content-Length header, if present. Otherwise, raise error. 172s """ 172s self.num_requests += 1 172s 172s timeout_obj = self._get_timeout(timeout) 172s timeout_obj.start_connect() 172s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 172s 172s try: 172s # Trigger any extra validation we need to do. 
172s try: 172s self._validate_conn(conn) 172s except (SocketTimeout, BaseSSLError) as e: 172s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 172s raise 172s 172s # _validate_conn() starts the connection to an HTTPS proxy 172s # so we need to wrap errors with 'ProxyError' here too. 172s except ( 172s OSError, 172s NewConnectionError, 172s TimeoutError, 172s BaseSSLError, 172s CertificateError, 172s SSLError, 172s ) as e: 172s new_e: Exception = e 172s if isinstance(e, (BaseSSLError, CertificateError)): 172s new_e = SSLError(e) 172s # If the connection didn't successfully connect to it's proxy 172s # then there 172s if isinstance( 172s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 172s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 172s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 172s raise new_e 172s 172s # conn.request() calls http.client.*.request, not the method in 172s # urllib3.request. It also calls makefile (recv) on the socket. 172s try: 172s conn.request( 172s method, 172s url, 172s body=body, 172s headers=headers, 172s chunked=chunked, 172s preload_content=preload_content, 172s decode_content=decode_content, 172s enforce_content_length=enforce_content_length, 172s ) 172s 172s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 172s # legitimately able to close the connection after sending a valid response. 172s # With this behaviour, the received response is still readable. 172s except BrokenPipeError: 172s pass 172s except OSError as e: 172s # MacOS/Linux 172s # EPROTOTYPE and ECONNRESET are needed on macOS 172s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 172s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 172s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 172s raise 172s 172s # Reset the timeout for the recv() on the socket 172s read_timeout = timeout_obj.read_timeout 172s 172s if not conn.is_closed: 172s # In Python 3 socket.py will catch EAGAIN and return None when you 172s # try and read into the file pointer created by http.client, which 172s # instead raises a BadStatusLine exception. Instead of catching 172s # the exception and assuming all BadStatusLine exceptions are read 172s # timeouts, check for a zero timeout before making the request. 172s if read_timeout == 0: 172s raise ReadTimeoutError( 172s self, url, f"Read timed out. (read timeout={read_timeout})" 172s ) 172s conn.timeout = read_timeout 172s 172s # Receive the response from the server 172s try: 172s response = conn.getresponse() 172s except (BaseSSLError, OSError) as e: 172s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 172s raise 172s 172s # Set properties that are used by the pooling layer. 
172s response.retries = retries 172s response._connection = response_conn # type: ignore[attr-defined] 172s response._pool = self # type: ignore[attr-defined] 172s 172s log.debug( 172s '%s://%s:%s "%s %s %s" %s %s', 172s self.scheme, 172s self.host, 172s self.port, 172s method, 172s url, 172s > response.version_string, 172s response.status, 172s response.length_remaining, 172s ) 172s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 172s 172s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 172s ----------------------------- Captured stderr call ----------------------------- 172s 127.0.0.1 - - [18/Jan/2025 03:25:43] "GET / HTTP/1.1" 200 9358 172s _______________________________ test_body[https] _______________________________ 172s 172s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-1/test_body_https_0') 172s httpbin_both = 172s verify_pool_mgr = 172s 172s def test_body(tmpdir, httpbin_both, verify_pool_mgr): 172s """Ensure the responses are all identical enough""" 172s url = httpbin_both.url + "/bytes/1024" 172s with vcr.use_cassette(str(tmpdir.join("body.yaml"))): 172s > content = verify_pool_mgr.request("GET", url).data 172s 172s tests/integration/test_urllib3.py:55: 172s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 172s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 172s return self.request_encode_url( 172s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 172s return self.urlopen(method, url, **extra_kw) 172s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 172s response = conn.urlopen(method, u.request_uri, **kw) 172s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 172s response = self._make_request( 172s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 172s 172s self = 172s conn = 172s method = 'GET', url = '/bytes/1024', body = None, headers = {} 172s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 172s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 172s chunked = False, response_conn = None, preload_content = True 172s decode_content = True, enforce_content_length = True 172s 172s def _make_request( 172s self, 172s conn: BaseHTTPConnection, 172s method: str, 172s url: str, 172s body: _TYPE_BODY | None = None, 172s headers: typing.Mapping[str, str] | None = None, 172s retries: Retry | None = None, 172s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 172s chunked: bool = False, 172s response_conn: BaseHTTPConnection | None = None, 172s preload_content: bool = True, 172s decode_content: bool = True, 172s enforce_content_length: bool = True, 172s ) -> BaseHTTPResponse: 172s """ 172s Perform a request on a given urllib connection object taken from our 172s pool. 172s 172s :param conn: 172s a connection from one of our connection pools 172s 172s :param method: 172s HTTP request method (such as GET, POST, PUT, etc.) 172s 172s :param url: 172s The URL to perform the request on. 172s 172s :param body: 172s Data to send in the request body, either :class:`str`, :class:`bytes`, 172s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 172s 172s :param headers: 172s Dictionary of custom headers to send, such as User-Agent, 172s If-None-Match, etc. If None, pool headers are used. If provided, 172s these headers completely replace any pool-specific headers. 
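The AttributeError shown above recurs throughout this log: urllib3 2.3.0's _make_request() now logs response.version_string (connectionpool.py:551), an attribute of urllib3's own BaseHTTPResponse that the VCRHTTPResponse stub substituted by vcr.py does not provide. Below is a minimal sketch of the kind of compatibility property such a stub would need; only the attribute name version_string is taken from the traceback, while the class body and the mapping from the integer version field are illustrative assumptions, not vcr.py's actual code.

    # Sketch only: a version_string shim for a recorded-response stub.
    # "VCRHTTPResponse" stands in for vcr.py's real class; the version->string
    # mapping is an assumption modelled on http.client's integer version field.
    class VCRHTTPResponse:
        def __init__(self, version: int = 11, status: int = 200) -> None:
            self.version = version      # 10 -> HTTP/1.0, 11 -> HTTP/1.1
            self.status = status

        @property
        def version_string(self) -> str:
            # What urllib3 2.3's connection pool expects to be able to log.
            return {10: "HTTP/1.0", 11: "HTTP/1.1"}.get(self.version, "HTTP/?")

With a property like this in place, the log.debug() call that currently raises would receive a plain string such as "HTTP/1.1".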
172s 172s :param retries: 172s Configure the number of retries to allow before raising a 172s :class:`~urllib3.exceptions.MaxRetryError` exception. 172s 172s Pass ``None`` to retry until you receive a response. Pass a 172s :class:`~urllib3.util.retry.Retry` object for fine-grained control 172s over different types of retries. 172s Pass an integer number to retry connection errors that many times, 172s but no other types of errors. Pass zero to never retry. 172s 172s If ``False``, then retries are disabled and any exception is raised 172s immediately. Also, instead of raising a MaxRetryError on redirects, 172s the redirect response will be returned. 172s 172s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 172s 172s :param timeout: 172s If specified, overrides the default timeout for this one 172s request. It may be a float (in seconds) or an instance of 172s :class:`urllib3.util.Timeout`. 172s 172s :param chunked: 172s If True, urllib3 will send the body using chunked transfer 172s encoding. Otherwise, urllib3 will send the body using the standard 172s content-length form. Defaults to False. 172s 172s :param response_conn: 172s Set this to ``None`` if you will handle releasing the connection or 172s set the connection to have the response release it. 172s 172s :param preload_content: 172s If True, the response's body will be preloaded during construction. 172s 172s :param decode_content: 172s If True, will attempt to decode the body based on the 172s 'content-encoding' header. 172s 172s :param enforce_content_length: 173s Enforce content length checking. Body returned by server must match 173s value of Content-Length header, if present. Otherwise, raise error. 173s """ 173s self.num_requests += 1 173s 173s timeout_obj = self._get_timeout(timeout) 173s timeout_obj.start_connect() 173s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 173s 173s try: 173s # Trigger any extra validation we need to do. 173s try: 173s self._validate_conn(conn) 173s except (SocketTimeout, BaseSSLError) as e: 173s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 173s raise 173s 173s # _validate_conn() starts the connection to an HTTPS proxy 173s # so we need to wrap errors with 'ProxyError' here too. 173s except ( 173s OSError, 173s NewConnectionError, 173s TimeoutError, 173s BaseSSLError, 173s CertificateError, 173s SSLError, 173s ) as e: 173s new_e: Exception = e 173s if isinstance(e, (BaseSSLError, CertificateError)): 173s new_e = SSLError(e) 173s # If the connection didn't successfully connect to it's proxy 173s # then there 173s if isinstance( 173s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 173s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 173s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 173s raise new_e 173s 173s # conn.request() calls http.client.*.request, not the method in 173s # urllib3.request. It also calls makefile (recv) on the socket. 173s try: 173s conn.request( 173s method, 173s url, 173s body=body, 173s headers=headers, 173s chunked=chunked, 173s preload_content=preload_content, 173s decode_content=decode_content, 173s enforce_content_length=enforce_content_length, 173s ) 173s 173s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 173s # legitimately able to close the connection after sending a valid response. 173s # With this behaviour, the received response is still readable. 
173s except BrokenPipeError: 173s pass 173s except OSError as e: 173s # MacOS/Linux 173s # EPROTOTYPE and ECONNRESET are needed on macOS 173s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 173s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 173s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 173s raise 173s 173s # Reset the timeout for the recv() on the socket 173s read_timeout = timeout_obj.read_timeout 173s 173s if not conn.is_closed: 173s # In Python 3 socket.py will catch EAGAIN and return None when you 173s # try and read into the file pointer created by http.client, which 173s # instead raises a BadStatusLine exception. Instead of catching 173s # the exception and assuming all BadStatusLine exceptions are read 173s # timeouts, check for a zero timeout before making the request. 173s if read_timeout == 0: 173s raise ReadTimeoutError( 173s self, url, f"Read timed out. (read timeout={read_timeout})" 173s ) 173s conn.timeout = read_timeout 173s 173s # Receive the response from the server 173s try: 173s response = conn.getresponse() 173s except (BaseSSLError, OSError) as e: 173s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 173s raise 173s 173s # Set properties that are used by the pooling layer. 173s response.retries = retries 173s response._connection = response_conn # type: ignore[attr-defined] 173s response._pool = self # type: ignore[attr-defined] 173s 173s log.debug( 173s '%s://%s:%s "%s %s %s" %s %s', 173s self.scheme, 173s self.host, 173s self.port, 173s method, 173s url, 173s > response.version_string, 173s response.status, 173s response.length_remaining, 173s ) 173s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 173s 173s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 173s ----------------------------- Captured stderr call ----------------------------- 173s 127.0.0.1 - - [18/Jan/2025 03:25:43] "GET /bytes/1024 HTTP/1.1" 200 1024 173s _______________________________ test_auth[https] _______________________________ 173s 173s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-1/test_auth_https_0') 173s httpbin_both = 173s verify_pool_mgr = 173s 173s def test_auth(tmpdir, httpbin_both, verify_pool_mgr): 173s """Ensure that we can handle basic auth""" 173s auth = ("user", "passwd") 173s headers = urllib3.util.make_headers(basic_auth="{}:{}".format(*auth)) 173s url = httpbin_both.url + "/basic-auth/user/passwd" 173s with vcr.use_cassette(str(tmpdir.join("auth.yaml"))): 173s > one = verify_pool_mgr.request("GET", url, headers=headers) 173s 173s tests/integration/test_urllib3.py:67: 173s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 173s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 173s return self.request_encode_url( 173s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 173s return self.urlopen(method, url, **extra_kw) 173s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 173s response = conn.urlopen(method, u.request_uri, **kw) 173s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 173s response = self._make_request( 173s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 173s 173s self = 173s conn = 173s method = 'GET', url = '/basic-auth/user/passwd', body = None 173s headers = {'authorization': 'Basic dXNlcjpwYXNzd2Q='} 173s retries = Retry(total=3, 
connect=None, read=None, redirect=None, status=None) 173s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 173s chunked = False, response_conn = None, preload_content = True 173s decode_content = True, enforce_content_length = True 173s 173s def _make_request( 173s self, 173s conn: BaseHTTPConnection, 173s method: str, 173s url: str, 173s body: _TYPE_BODY | None = None, 173s headers: typing.Mapping[str, str] | None = None, 173s retries: Retry | None = None, 173s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 173s chunked: bool = False, 173s response_conn: BaseHTTPConnection | None = None, 173s preload_content: bool = True, 173s decode_content: bool = True, 173s enforce_content_length: bool = True, 173s ) -> BaseHTTPResponse: 173s """ 173s Perform a request on a given urllib connection object taken from our 173s pool. 173s 173s :param conn: 173s a connection from one of our connection pools 173s 173s :param method: 173s HTTP request method (such as GET, POST, PUT, etc.) 173s 173s :param url: 173s The URL to perform the request on. 173s 173s :param body: 173s Data to send in the request body, either :class:`str`, :class:`bytes`, 173s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 173s 173s :param headers: 173s Dictionary of custom headers to send, such as User-Agent, 173s If-None-Match, etc. If None, pool headers are used. If provided, 173s these headers completely replace any pool-specific headers. 173s 173s :param retries: 173s Configure the number of retries to allow before raising a 173s :class:`~urllib3.exceptions.MaxRetryError` exception. 173s 173s Pass ``None`` to retry until you receive a response. Pass a 173s :class:`~urllib3.util.retry.Retry` object for fine-grained control 173s over different types of retries. 173s Pass an integer number to retry connection errors that many times, 173s but no other types of errors. Pass zero to never retry. 173s 173s If ``False``, then retries are disabled and any exception is raised 173s immediately. Also, instead of raising a MaxRetryError on redirects, 173s the redirect response will be returned. 173s 173s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 173s 173s :param timeout: 173s If specified, overrides the default timeout for this one 173s request. It may be a float (in seconds) or an instance of 173s :class:`urllib3.util.Timeout`. 173s 173s :param chunked: 173s If True, urllib3 will send the body using chunked transfer 173s encoding. Otherwise, urllib3 will send the body using the standard 173s content-length form. Defaults to False. 173s 173s :param response_conn: 173s Set this to ``None`` if you will handle releasing the connection or 173s set the connection to have the response release it. 173s 173s :param preload_content: 173s If True, the response's body will be preloaded during construction. 173s 173s :param decode_content: 173s If True, will attempt to decode the body based on the 173s 'content-encoding' header. 173s 173s :param enforce_content_length: 173s Enforce content length checking. Body returned by server must match 173s value of Content-Length header, if present. Otherwise, raise error. 173s """ 173s self.num_requests += 1 173s 173s timeout_obj = self._get_timeout(timeout) 173s timeout_obj.start_connect() 173s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 173s 173s try: 173s # Trigger any extra validation we need to do. 
173s try: 173s self._validate_conn(conn) 173s except (SocketTimeout, BaseSSLError) as e: 173s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 173s raise 173s 173s # _validate_conn() starts the connection to an HTTPS proxy 173s # so we need to wrap errors with 'ProxyError' here too. 173s except ( 173s OSError, 173s NewConnectionError, 173s TimeoutError, 173s BaseSSLError, 173s CertificateError, 173s SSLError, 173s ) as e: 173s new_e: Exception = e 173s if isinstance(e, (BaseSSLError, CertificateError)): 173s new_e = SSLError(e) 173s # If the connection didn't successfully connect to it's proxy 173s # then there 173s if isinstance( 173s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 173s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 173s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 173s raise new_e 173s 173s # conn.request() calls http.client.*.request, not the method in 173s # urllib3.request. It also calls makefile (recv) on the socket. 173s try: 173s conn.request( 173s method, 173s url, 173s body=body, 173s headers=headers, 173s chunked=chunked, 173s preload_content=preload_content, 173s decode_content=decode_content, 173s enforce_content_length=enforce_content_length, 173s ) 173s 173s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 173s # legitimately able to close the connection after sending a valid response. 173s # With this behaviour, the received response is still readable. 173s except BrokenPipeError: 173s pass 173s except OSError as e: 173s # MacOS/Linux 173s # EPROTOTYPE and ECONNRESET are needed on macOS 173s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 173s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 173s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 173s raise 173s 173s # Reset the timeout for the recv() on the socket 173s read_timeout = timeout_obj.read_timeout 173s 173s if not conn.is_closed: 173s # In Python 3 socket.py will catch EAGAIN and return None when you 173s # try and read into the file pointer created by http.client, which 173s # instead raises a BadStatusLine exception. Instead of catching 173s # the exception and assuming all BadStatusLine exceptions are read 173s # timeouts, check for a zero timeout before making the request. 173s if read_timeout == 0: 173s raise ReadTimeoutError( 173s self, url, f"Read timed out. (read timeout={read_timeout})" 173s ) 173s conn.timeout = read_timeout 173s 173s # Receive the response from the server 173s try: 173s response = conn.getresponse() 173s except (BaseSSLError, OSError) as e: 173s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 173s raise 173s 173s # Set properties that are used by the pooling layer. 
173s response.retries = retries 173s response._connection = response_conn # type: ignore[attr-defined] 173s response._pool = self # type: ignore[attr-defined] 173s 173s log.debug( 173s '%s://%s:%s "%s %s %s" %s %s', 173s self.scheme, 173s self.host, 173s self.port, 173s method, 173s url, 173s > response.version_string, 173s response.status, 173s response.length_remaining, 173s ) 173s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 173s 173s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 173s ----------------------------- Captured stderr call ----------------------------- 173s 127.0.0.1 - - [18/Jan/2025 03:25:43] "GET /basic-auth/user/passwd HTTP/1.1" 200 46 173s ___________________________ test_auth_failed[https] ____________________________ 173s 173s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-1/test_auth_failed_https_0') 173s httpbin_both = 173s verify_pool_mgr = 173s 173s def test_auth_failed(tmpdir, httpbin_both, verify_pool_mgr): 173s """Ensure that we can save failed auth statuses""" 173s auth = ("user", "wrongwrongwrong") 173s headers = urllib3.util.make_headers(basic_auth="{}:{}".format(*auth)) 173s url = httpbin_both.url + "/basic-auth/user/passwd" 173s with vcr.use_cassette(str(tmpdir.join("auth-failed.yaml"))) as cass: 173s # Ensure that this is empty to begin with 173s assert_cassette_empty(cass) 173s > one = verify_pool_mgr.request("GET", url, headers=headers) 173s 173s tests/integration/test_urllib3.py:83: 173s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 173s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 173s return self.request_encode_url( 173s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 173s return self.urlopen(method, url, **extra_kw) 173s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 173s response = conn.urlopen(method, u.request_uri, **kw) 173s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 173s response = self._make_request( 173s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 173s 173s self = 173s conn = 173s method = 'GET', url = '/basic-auth/user/passwd', body = None 173s headers = {'authorization': 'Basic dXNlcjp3cm9uZ3dyb25nd3Jvbmc='} 173s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 173s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 173s chunked = False, response_conn = None, preload_content = True 173s decode_content = True, enforce_content_length = True 173s 173s def _make_request( 173s self, 173s conn: BaseHTTPConnection, 173s method: str, 173s url: str, 173s body: _TYPE_BODY | None = None, 173s headers: typing.Mapping[str, str] | None = None, 173s retries: Retry | None = None, 173s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 173s chunked: bool = False, 173s response_conn: BaseHTTPConnection | None = None, 173s preload_content: bool = True, 173s decode_content: bool = True, 173s enforce_content_length: bool = True, 173s ) -> BaseHTTPResponse: 173s """ 173s Perform a request on a given urllib connection object taken from our 173s pool. 173s 173s :param conn: 173s a connection from one of our connection pools 173s 173s :param method: 173s HTTP request method (such as GET, POST, PUT, etc.) 173s 173s :param url: 173s The URL to perform the request on. 
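The test_auth and test_auth_failed tests quoted above build their Authorization header with urllib3.util.make_headers, and the parameter dump shows the result ({'authorization': 'Basic dXNlcjpwYXNzd2Q='}). A small sketch of that call in isolation:

    # Sketch: how the auth tests above construct a basic-auth header.
    import urllib3

    headers = urllib3.util.make_headers(basic_auth="user:passwd")
    print(headers)  # {'authorization': 'Basic dXNlcjpwYXNzd2Q='}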
173s 173s :param body: 173s Data to send in the request body, either :class:`str`, :class:`bytes`, 173s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 173s 173s :param headers: 173s Dictionary of custom headers to send, such as User-Agent, 173s If-None-Match, etc. If None, pool headers are used. If provided, 173s these headers completely replace any pool-specific headers. 173s 173s :param retries: 173s Configure the number of retries to allow before raising a 173s :class:`~urllib3.exceptions.MaxRetryError` exception. 173s 173s Pass ``None`` to retry until you receive a response. Pass a 173s :class:`~urllib3.util.retry.Retry` object for fine-grained control 173s over different types of retries. 173s Pass an integer number to retry connection errors that many times, 173s but no other types of errors. Pass zero to never retry. 173s 173s If ``False``, then retries are disabled and any exception is raised 173s immediately. Also, instead of raising a MaxRetryError on redirects, 173s the redirect response will be returned. 173s 173s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 173s 173s :param timeout: 173s If specified, overrides the default timeout for this one 173s request. It may be a float (in seconds) or an instance of 173s :class:`urllib3.util.Timeout`. 173s 173s :param chunked: 173s If True, urllib3 will send the body using chunked transfer 173s encoding. Otherwise, urllib3 will send the body using the standard 173s content-length form. Defaults to False. 173s 173s :param response_conn: 173s Set this to ``None`` if you will handle releasing the connection or 173s set the connection to have the response release it. 173s 173s :param preload_content: 173s If True, the response's body will be preloaded during construction. 173s 173s :param decode_content: 173s If True, will attempt to decode the body based on the 173s 'content-encoding' header. 173s 173s :param enforce_content_length: 173s Enforce content length checking. Body returned by server must match 173s value of Content-Length header, if present. Otherwise, raise error. 173s """ 173s self.num_requests += 1 173s 173s timeout_obj = self._get_timeout(timeout) 173s timeout_obj.start_connect() 173s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 173s 173s try: 173s # Trigger any extra validation we need to do. 173s try: 173s self._validate_conn(conn) 173s except (SocketTimeout, BaseSSLError) as e: 173s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 173s raise 173s 173s # _validate_conn() starts the connection to an HTTPS proxy 173s # so we need to wrap errors with 'ProxyError' here too. 173s except ( 173s OSError, 173s NewConnectionError, 173s TimeoutError, 173s BaseSSLError, 173s CertificateError, 173s SSLError, 173s ) as e: 173s new_e: Exception = e 173s if isinstance(e, (BaseSSLError, CertificateError)): 173s new_e = SSLError(e) 173s # If the connection didn't successfully connect to it's proxy 173s # then there 173s if isinstance( 173s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 173s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 173s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 173s raise new_e 173s 173s # conn.request() calls http.client.*.request, not the method in 173s # urllib3.request. It also calls makefile (recv) on the socket. 
173s try: 173s conn.request( 173s method, 173s url, 173s body=body, 173s headers=headers, 173s chunked=chunked, 173s preload_content=preload_content, 173s decode_content=decode_content, 173s enforce_content_length=enforce_content_length, 173s ) 173s 173s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 173s # legitimately able to close the connection after sending a valid response. 173s # With this behaviour, the received response is still readable. 173s except BrokenPipeError: 173s pass 173s except OSError as e: 173s # MacOS/Linux 173s # EPROTOTYPE and ECONNRESET are needed on macOS 173s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 173s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 173s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 173s raise 173s 173s # Reset the timeout for the recv() on the socket 173s read_timeout = timeout_obj.read_timeout 173s 173s if not conn.is_closed: 173s # In Python 3 socket.py will catch EAGAIN and return None when you 173s # try and read into the file pointer created by http.client, which 173s # instead raises a BadStatusLine exception. Instead of catching 173s # the exception and assuming all BadStatusLine exceptions are read 173s # timeouts, check for a zero timeout before making the request. 173s if read_timeout == 0: 173s raise ReadTimeoutError( 173s self, url, f"Read timed out. (read timeout={read_timeout})" 173s ) 173s conn.timeout = read_timeout 173s 173s # Receive the response from the server 173s try: 173s response = conn.getresponse() 173s except (BaseSSLError, OSError) as e: 173s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 173s raise 173s 173s # Set properties that are used by the pooling layer. 
173s response.retries = retries 173s response._connection = response_conn # type: ignore[attr-defined] 173s response._pool = self # type: ignore[attr-defined] 173s 173s log.debug( 173s '%s://%s:%s "%s %s %s" %s %s', 173s self.scheme, 173s self.host, 173s self.port, 173s method, 173s url, 173s > response.version_string, 173s response.status, 173s response.length_remaining, 173s ) 173s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 173s 173s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 173s ----------------------------- Captured stderr call ----------------------------- 173s 127.0.0.1 - - [18/Jan/2025 03:25:43] "GET /basic-auth/user/passwd HTTP/1.1" 401 0 173s _______________________________ test_post[https] _______________________________ 173s 173s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-1/test_post_https_0') 173s httpbin_both = 173s verify_pool_mgr = 173s 173s def test_post(tmpdir, httpbin_both, verify_pool_mgr): 173s """Ensure that we can post and cache the results""" 173s data = {"key1": "value1", "key2": "value2"} 173s url = httpbin_both.url + "/post" 173s with vcr.use_cassette(str(tmpdir.join("verify_pool_mgr.yaml"))): 173s > req1 = verify_pool_mgr.request("POST", url, data).data 173s 173s tests/integration/test_urllib3.py:94: 173s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 173s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:143: in request 173s return self.request_encode_body( 173s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:278: in request_encode_body 173s return self.urlopen(method, url, **extra_kw) 173s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 173s response = conn.urlopen(method, u.request_uri, **kw) 173s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:871: in urlopen 173s return self.urlopen( 173s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 173s response = self._make_request( 173s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 173s 173s self = 173s conn = 173s method = 'POST', url = '/post', body = {'key1': 'value1', 'key2': 'value2'} 173s headers = HTTPHeaderDict({}) 173s retries = Retry(total=2, connect=None, read=None, redirect=None, status=None) 173s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 173s chunked = False, response_conn = None, preload_content = True 173s decode_content = True, enforce_content_length = True 173s 173s def _make_request( 173s self, 173s conn: BaseHTTPConnection, 173s method: str, 173s url: str, 173s body: _TYPE_BODY | None = None, 173s headers: typing.Mapping[str, str] | None = None, 173s retries: Retry | None = None, 173s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 173s chunked: bool = False, 173s response_conn: BaseHTTPConnection | None = None, 173s preload_content: bool = True, 173s decode_content: bool = True, 173s enforce_content_length: bool = True, 173s ) -> BaseHTTPResponse: 173s """ 173s Perform a request on a given urllib connection object taken from our 173s pool. 173s 173s :param conn: 173s a connection from one of our connection pools 173s 173s :param method: 173s HTTP request method (such as GET, POST, PUT, etc.) 173s 173s :param url: 173s The URL to perform the request on. 173s 173s :param body: 173s Data to send in the request body, either :class:`str`, :class:`bytes`, 173s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 
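The dumps above show the retries and timeout values in play (e.g. Retry(total=2, ...) and a Timeout still carrying the default sentinel for connect/read). The docstring repeated in these tracebacks describes both parameters; a brief sketch of passing them explicitly through urllib3's public API, with a placeholder URL:

    # Sketch: supplying Retry and Timeout objects explicitly (placeholder URL).
    import urllib3
    from urllib3.util.retry import Retry
    from urllib3.util.timeout import Timeout

    pool = urllib3.PoolManager()
    resp = pool.request(
        "GET",
        "http://127.0.0.1:8080/get",            # placeholder URL
        retries=Retry(total=3, redirect=2),     # fine-grained retry control
        timeout=Timeout(connect=2.0, read=5.0), # per-request connect/read timeouts
    )
    print(resp.status)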
173s 173s :param headers: 173s Dictionary of custom headers to send, such as User-Agent, 173s If-None-Match, etc. If None, pool headers are used. If provided, 173s these headers completely replace any pool-specific headers. 173s 173s :param retries: 173s Configure the number of retries to allow before raising a 173s :class:`~urllib3.exceptions.MaxRetryError` exception. 173s 173s Pass ``None`` to retry until you receive a response. Pass a 173s :class:`~urllib3.util.retry.Retry` object for fine-grained control 173s over different types of retries. 173s Pass an integer number to retry connection errors that many times, 173s but no other types of errors. Pass zero to never retry. 173s 173s If ``False``, then retries are disabled and any exception is raised 173s immediately. Also, instead of raising a MaxRetryError on redirects, 173s the redirect response will be returned. 173s 173s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 173s 173s :param timeout: 173s If specified, overrides the default timeout for this one 173s request. It may be a float (in seconds) or an instance of 173s :class:`urllib3.util.Timeout`. 173s 173s :param chunked: 173s If True, urllib3 will send the body using chunked transfer 173s encoding. Otherwise, urllib3 will send the body using the standard 173s content-length form. Defaults to False. 173s 173s :param response_conn: 173s Set this to ``None`` if you will handle releasing the connection or 173s set the connection to have the response release it. 173s 173s :param preload_content: 173s If True, the response's body will be preloaded during construction. 173s 173s :param decode_content: 173s If True, will attempt to decode the body based on the 173s 'content-encoding' header. 173s 173s :param enforce_content_length: 173s Enforce content length checking. Body returned by server must match 173s value of Content-Length header, if present. Otherwise, raise error. 173s """ 173s self.num_requests += 1 173s 173s timeout_obj = self._get_timeout(timeout) 173s timeout_obj.start_connect() 173s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 173s 173s try: 173s # Trigger any extra validation we need to do. 173s try: 173s self._validate_conn(conn) 173s except (SocketTimeout, BaseSSLError) as e: 173s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 173s raise 173s 173s # _validate_conn() starts the connection to an HTTPS proxy 173s # so we need to wrap errors with 'ProxyError' here too. 173s except ( 173s OSError, 173s NewConnectionError, 173s TimeoutError, 173s BaseSSLError, 173s CertificateError, 173s SSLError, 173s ) as e: 173s new_e: Exception = e 173s if isinstance(e, (BaseSSLError, CertificateError)): 173s new_e = SSLError(e) 173s # If the connection didn't successfully connect to it's proxy 173s # then there 173s if isinstance( 173s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 173s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 173s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 173s raise new_e 173s 173s # conn.request() calls http.client.*.request, not the method in 173s # urllib3.request. It also calls makefile (recv) on the socket. 
173s try: 173s conn.request( 173s method, 173s url, 173s body=body, 173s headers=headers, 173s chunked=chunked, 173s preload_content=preload_content, 173s decode_content=decode_content, 173s enforce_content_length=enforce_content_length, 173s ) 173s 173s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 173s # legitimately able to close the connection after sending a valid response. 173s # With this behaviour, the received response is still readable. 173s except BrokenPipeError: 173s pass 173s except OSError as e: 173s # MacOS/Linux 173s # EPROTOTYPE and ECONNRESET are needed on macOS 173s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 173s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 173s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 173s raise 173s 173s # Reset the timeout for the recv() on the socket 173s read_timeout = timeout_obj.read_timeout 173s 173s if not conn.is_closed: 173s # In Python 3 socket.py will catch EAGAIN and return None when you 173s # try and read into the file pointer created by http.client, which 173s # instead raises a BadStatusLine exception. Instead of catching 173s # the exception and assuming all BadStatusLine exceptions are read 173s # timeouts, check for a zero timeout before making the request. 173s if read_timeout == 0: 173s raise ReadTimeoutError( 173s self, url, f"Read timed out. (read timeout={read_timeout})" 173s ) 173s conn.timeout = read_timeout 173s 173s # Receive the response from the server 173s try: 173s response = conn.getresponse() 173s except (BaseSSLError, OSError) as e: 173s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 173s raise 173s 173s # Set properties that are used by the pooling layer. 
173s response.retries = retries 173s response._connection = response_conn # type: ignore[attr-defined] 173s response._pool = self # type: ignore[attr-defined] 173s 173s log.debug( 173s '%s://%s:%s "%s %s %s" %s %s', 173s self.scheme, 173s self.host, 173s self.port, 173s method, 173s url, 173s > response.version_string, 173s response.status, 173s response.length_remaining, 173s ) 173s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 173s 173s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 173s ----------------------------- Captured stderr call ----------------------------- 173s 127.0.0.1 - - [18/Jan/2025 03:25:43] "POST /post HTTP/1.1" 501 159 173s 127.0.0.1 - - [18/Jan/2025 03:25:43] "POST /post HTTP/1.1" 501 159 173s ------------------------------ Captured log call ------------------------------- 173s WARNING urllib3.connectionpool:connectionpool.py:868 Retrying (Retry(total=2, connect=None, read=None, redirect=None, status=None)) after connection broken by 'SSLError(5, '[SYS] unknown error (_ssl.c:2418)')': /post 173s _______________________________ test_gzip[https] _______________________________ 173s 173s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-1/test_gzip_https_0') 173s httpbin_both = 173s verify_pool_mgr = 173s 173s def test_gzip(tmpdir, httpbin_both, verify_pool_mgr): 173s """ 173s Ensure that requests (actually urllib3) is able to automatically decompress 173s the response body 173s """ 173s url = httpbin_both.url + "/gzip" 173s response = verify_pool_mgr.request("GET", url) 173s 173s with vcr.use_cassette(str(tmpdir.join("gzip.yaml"))): 173s > response = verify_pool_mgr.request("GET", url) 173s 173s tests/integration/test_urllib3.py:140: 173s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 173s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 173s return self.request_encode_url( 173s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 173s return self.urlopen(method, url, **extra_kw) 173s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 173s response = conn.urlopen(method, u.request_uri, **kw) 173s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 173s response = self._make_request( 173s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 173s 173s self = 173s conn = 173s method = 'GET', url = '/gzip', body = None, headers = {} 173s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 173s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 173s chunked = False, response_conn = None, preload_content = True 173s decode_content = True, enforce_content_length = True 173s 173s def _make_request( 173s self, 173s conn: BaseHTTPConnection, 173s method: str, 173s url: str, 173s body: _TYPE_BODY | None = None, 173s headers: typing.Mapping[str, str] | None = None, 173s retries: Retry | None = None, 173s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 173s chunked: bool = False, 173s response_conn: BaseHTTPConnection | None = None, 173s preload_content: bool = True, 173s decode_content: bool = True, 173s enforce_content_length: bool = True, 173s ) -> BaseHTTPResponse: 173s """ 173s Perform a request on a given urllib connection object taken from our 173s pool. 
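test_gzip, whose body is quoted above, expects urllib3 to decompress the gzip-encoded response transparently; that behaviour corresponds to the decode_content flag documented in the docstring. A short sketch against a placeholder URL:

    # Sketch: automatic decompression (decode_content defaults to True).
    import urllib3

    pool = urllib3.PoolManager()
    resp = pool.request("GET", "http://127.0.0.1:8080/gzip")  # placeholder URL
    body = resp.data  # already decompressed when the server sent Content-Encoding: gzip
    # Pass decode_content=False to keep the raw, still-compressed bytes instead.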
173s 173s :param conn: 173s a connection from one of our connection pools 173s 173s :param method: 173s HTTP request method (such as GET, POST, PUT, etc.) 173s 173s :param url: 173s The URL to perform the request on. 173s 173s :param body: 173s Data to send in the request body, either :class:`str`, :class:`bytes`, 173s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 173s 173s :param headers: 173s Dictionary of custom headers to send, such as User-Agent, 173s If-None-Match, etc. If None, pool headers are used. If provided, 173s these headers completely replace any pool-specific headers. 173s 173s :param retries: 173s Configure the number of retries to allow before raising a 173s :class:`~urllib3.exceptions.MaxRetryError` exception. 173s 173s Pass ``None`` to retry until you receive a response. Pass a 173s :class:`~urllib3.util.retry.Retry` object for fine-grained control 173s over different types of retries. 173s Pass an integer number to retry connection errors that many times, 173s but no other types of errors. Pass zero to never retry. 173s 173s If ``False``, then retries are disabled and any exception is raised 173s immediately. Also, instead of raising a MaxRetryError on redirects, 173s the redirect response will be returned. 173s 173s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 173s 173s :param timeout: 173s If specified, overrides the default timeout for this one 173s request. It may be a float (in seconds) or an instance of 173s :class:`urllib3.util.Timeout`. 173s 173s :param chunked: 173s If True, urllib3 will send the body using chunked transfer 173s encoding. Otherwise, urllib3 will send the body using the standard 173s content-length form. Defaults to False. 173s 173s :param response_conn: 173s Set this to ``None`` if you will handle releasing the connection or 173s set the connection to have the response release it. 173s 173s :param preload_content: 173s If True, the response's body will be preloaded during construction. 173s 173s :param decode_content: 173s If True, will attempt to decode the body based on the 173s 'content-encoding' header. 173s 173s :param enforce_content_length: 173s Enforce content length checking. Body returned by server must match 173s value of Content-Length header, if present. Otherwise, raise error. 173s """ 173s self.num_requests += 1 173s 173s timeout_obj = self._get_timeout(timeout) 173s timeout_obj.start_connect() 173s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 173s 173s try: 173s # Trigger any extra validation we need to do. 173s try: 173s self._validate_conn(conn) 173s except (SocketTimeout, BaseSSLError) as e: 173s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 173s raise 173s 173s # _validate_conn() starts the connection to an HTTPS proxy 173s # so we need to wrap errors with 'ProxyError' here too. 173s except ( 173s OSError, 173s NewConnectionError, 173s TimeoutError, 173s BaseSSLError, 173s CertificateError, 173s SSLError, 173s ) as e: 173s new_e: Exception = e 173s if isinstance(e, (BaseSSLError, CertificateError)): 173s new_e = SSLError(e) 173s # If the connection didn't successfully connect to it's proxy 173s # then there 173s if isinstance( 173s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 173s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 173s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 173s raise new_e 173s 173s # conn.request() calls http.client.*.request, not the method in 173s # urllib3.request. 
It also calls makefile (recv) on the socket. 173s try: 173s conn.request( 173s method, 173s url, 173s body=body, 173s headers=headers, 173s chunked=chunked, 173s preload_content=preload_content, 173s decode_content=decode_content, 173s enforce_content_length=enforce_content_length, 173s ) 173s 173s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 173s # legitimately able to close the connection after sending a valid response. 173s # With this behaviour, the received response is still readable. 173s except BrokenPipeError: 173s pass 173s except OSError as e: 173s # MacOS/Linux 173s # EPROTOTYPE and ECONNRESET are needed on macOS 173s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 173s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 173s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 173s raise 173s 173s # Reset the timeout for the recv() on the socket 173s read_timeout = timeout_obj.read_timeout 173s 173s if not conn.is_closed: 173s # In Python 3 socket.py will catch EAGAIN and return None when you 173s # try and read into the file pointer created by http.client, which 173s # instead raises a BadStatusLine exception. Instead of catching 173s # the exception and assuming all BadStatusLine exceptions are read 173s # timeouts, check for a zero timeout before making the request. 173s if read_timeout == 0: 173s raise ReadTimeoutError( 173s self, url, f"Read timed out. (read timeout={read_timeout})" 173s ) 173s conn.timeout = read_timeout 173s 173s # Receive the response from the server 173s try: 173s response = conn.getresponse() 173s except (BaseSSLError, OSError) as e: 173s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 173s raise 173s 173s # Set properties that are used by the pooling layer. 
173s response.retries = retries 173s response._connection = response_conn # type: ignore[attr-defined] 173s response._pool = self # type: ignore[attr-defined] 173s 173s log.debug( 173s '%s://%s:%s "%s %s %s" %s %s', 173s self.scheme, 173s self.host, 173s self.port, 173s method, 173s url, 173s > response.version_string, 173s response.status, 173s response.length_remaining, 173s ) 173s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 173s 173s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 173s ----------------------------- Captured stderr call ----------------------------- 173s 127.0.0.1 - - [18/Jan/2025 03:25:43] "GET /gzip HTTP/1.1" 200 165 173s 127.0.0.1 - - [18/Jan/2025 03:25:43] "GET /gzip HTTP/1.1" 200 165 173s ________________________________ test_use_proxy ________________________________ 173s 173s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-1/test_use_proxy0') 173s httpbin = 173s proxy_server = 'http://0.0.0.0:58685' 173s 173s def test_use_proxy(tmpdir, httpbin, proxy_server): 173s """Ensure that it works with a proxy.""" 173s with vcr.use_cassette(str(tmpdir.join("proxy.yaml"))): 173s > response = requests.get(httpbin.url, proxies={"http": proxy_server}) 173s 173s tests/integration/test_proxy.py:53: 173s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 173s /usr/lib/python3/dist-packages/requests/api.py:73: in get 173s return request("get", url, params=params, **kwargs) 173s /usr/lib/python3/dist-packages/requests/api.py:59: in request 173s return session.request(method=method, url=url, **kwargs) 173s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 173s resp = self.send(prep, **send_kwargs) 173s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 173s r = adapter.send(request, **kwargs) 173s /usr/lib/python3/dist-packages/requests/adapters.py:667: in send 173s resp = conn.urlopen( 173s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 173s response = self._make_request( 173s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 173s 173s self = 173s conn = 173s method = 'GET', url = 'http://127.0.0.1:36663/', body = None 173s headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate, br', 'Accept': '*/*', 'Connection': 'keep-alive'} 173s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 173s timeout = Timeout(connect=None, read=None, total=None), chunked = False 173s response_conn = 173s preload_content = False, decode_content = False, enforce_content_length = True 173s 173s def _make_request( 173s self, 173s conn: BaseHTTPConnection, 173s method: str, 173s url: str, 173s body: _TYPE_BODY | None = None, 173s headers: typing.Mapping[str, str] | None = None, 173s retries: Retry | None = None, 173s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 173s chunked: bool = False, 173s response_conn: BaseHTTPConnection | None = None, 173s preload_content: bool = True, 173s decode_content: bool = True, 173s enforce_content_length: bool = True, 173s ) -> BaseHTTPResponse: 173s """ 173s Perform a request on a given urllib connection object taken from our 173s pool. 173s 173s :param conn: 173s a connection from one of our connection pools 173s 173s :param method: 173s HTTP request method (such as GET, POST, PUT, etc.) 173s 173s :param url: 173s The URL to perform the request on. 
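test_use_proxy, quoted above, drives requests through an HTTP proxy via requests' proxies= argument; the parameter dump shows the absolute URL ('http://127.0.0.1:36663/') that urllib3 then sends to the proxy. The direct urllib3 equivalent is a ProxyManager; a sketch with placeholder addresses:

    # Sketch: routing plain-HTTP requests through a proxy with urllib3.
    import urllib3

    proxy = urllib3.ProxyManager("http://127.0.0.1:58685")  # placeholder proxy address
    resp = proxy.request("GET", "http://127.0.0.1:36663/")  # placeholder origin URL
    print(resp.status)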
173s 173s :param body: 173s Data to send in the request body, either :class:`str`, :class:`bytes`, 173s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 173s 173s :param headers: 173s Dictionary of custom headers to send, such as User-Agent, 173s If-None-Match, etc. If None, pool headers are used. If provided, 173s these headers completely replace any pool-specific headers. 173s 173s :param retries: 173s Configure the number of retries to allow before raising a 173s :class:`~urllib3.exceptions.MaxRetryError` exception. 173s 173s Pass ``None`` to retry until you receive a response. Pass a 173s :class:`~urllib3.util.retry.Retry` object for fine-grained control 173s over different types of retries. 173s Pass an integer number to retry connection errors that many times, 173s but no other types of errors. Pass zero to never retry. 173s 173s If ``False``, then retries are disabled and any exception is raised 173s immediately. Also, instead of raising a MaxRetryError on redirects, 173s the redirect response will be returned. 173s 173s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 173s 173s :param timeout: 173s If specified, overrides the default timeout for this one 173s request. It may be a float (in seconds) or an instance of 173s :class:`urllib3.util.Timeout`. 173s 173s :param chunked: 173s If True, urllib3 will send the body using chunked transfer 173s encoding. Otherwise, urllib3 will send the body using the standard 173s content-length form. Defaults to False. 173s 173s :param response_conn: 173s Set this to ``None`` if you will handle releasing the connection or 173s set the connection to have the response release it. 173s 173s :param preload_content: 173s If True, the response's body will be preloaded during construction. 173s 173s :param decode_content: 173s If True, will attempt to decode the body based on the 173s 'content-encoding' header. 173s 173s :param enforce_content_length: 173s Enforce content length checking. Body returned by server must match 173s value of Content-Length header, if present. Otherwise, raise error. 173s """ 173s self.num_requests += 1 173s 173s timeout_obj = self._get_timeout(timeout) 173s timeout_obj.start_connect() 173s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 173s 173s try: 173s # Trigger any extra validation we need to do. 173s try: 173s self._validate_conn(conn) 173s except (SocketTimeout, BaseSSLError) as e: 173s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 173s raise 173s 173s # _validate_conn() starts the connection to an HTTPS proxy 173s # so we need to wrap errors with 'ProxyError' here too. 173s except ( 173s OSError, 173s NewConnectionError, 173s TimeoutError, 173s BaseSSLError, 173s CertificateError, 173s SSLError, 173s ) as e: 173s new_e: Exception = e 173s if isinstance(e, (BaseSSLError, CertificateError)): 173s new_e = SSLError(e) 173s # If the connection didn't successfully connect to it's proxy 173s # then there 173s if isinstance( 173s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 173s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 173s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 173s raise new_e 173s 173s # conn.request() calls http.client.*.request, not the method in 173s # urllib3.request. It also calls makefile (recv) on the socket. 
173s try: 173s conn.request( 173s method, 173s url, 173s body=body, 173s headers=headers, 173s chunked=chunked, 173s preload_content=preload_content, 173s decode_content=decode_content, 173s enforce_content_length=enforce_content_length, 173s ) 173s 173s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 173s # legitimately able to close the connection after sending a valid response. 173s # With this behaviour, the received response is still readable. 173s except BrokenPipeError: 173s pass 173s except OSError as e: 173s # MacOS/Linux 173s # EPROTOTYPE and ECONNRESET are needed on macOS 173s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 173s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 173s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 173s raise 173s 173s # Reset the timeout for the recv() on the socket 173s read_timeout = timeout_obj.read_timeout 173s 173s if not conn.is_closed: 173s # In Python 3 socket.py will catch EAGAIN and return None when you 173s # try and read into the file pointer created by http.client, which 173s # instead raises a BadStatusLine exception. Instead of catching 173s # the exception and assuming all BadStatusLine exceptions are read 173s # timeouts, check for a zero timeout before making the request. 173s if read_timeout == 0: 173s raise ReadTimeoutError( 173s self, url, f"Read timed out. (read timeout={read_timeout})" 173s ) 173s conn.timeout = read_timeout 173s 173s # Receive the response from the server 173s try: 173s response = conn.getresponse() 173s except (BaseSSLError, OSError) as e: 173s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 173s raise 173s 173s # Set properties that are used by the pooling layer. 
173s response.retries = retries 173s response._connection = response_conn # type: ignore[attr-defined] 173s response._pool = self # type: ignore[attr-defined] 173s 173s log.debug( 173s '%s://%s:%s "%s %s %s" %s %s', 173s self.scheme, 173s self.host, 173s self.port, 173s method, 173s url, 173s > response.version_string, 173s response.status, 173s response.length_remaining, 173s ) 173s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 173s 173s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 173s ----------------------------- Captured stderr call ----------------------------- 173s 127.0.0.1 - - [18/Jan/2025 03:25:44] "GET / HTTP/1.1" 200 9358 173s 127.0.0.1 - - [18/Jan/2025 03:25:44] "GET http://127.0.0.1:36663/ HTTP/1.1" 200 - 173s ______________________________ test_cross_scheme _______________________________ 173s 173s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-1/test_cross_scheme2') 173s httpbin = 173s httpbin_secure = 173s verify_pool_mgr = 173s 173s def test_cross_scheme(tmpdir, httpbin, httpbin_secure, verify_pool_mgr): 173s """Ensure that requests between schemes are treated separately""" 173s # First fetch a url under http, and then again under https and then 173s # ensure that we haven't served anything out of cache, and we have two 173s # requests / response pairs in the cassette 173s with vcr.use_cassette(str(tmpdir.join("cross_scheme.yaml"))) as cass: 173s > verify_pool_mgr.request("GET", httpbin_secure.url) 173s 173s tests/integration/test_urllib3.py:125: 173s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 173s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 173s return self.request_encode_url( 173s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 173s return self.urlopen(method, url, **extra_kw) 173s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 173s response = conn.urlopen(method, u.request_uri, **kw) 173s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 173s response = self._make_request( 173s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 173s 173s self = 173s conn = 173s method = 'GET', url = '/', body = None, headers = {} 173s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 173s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 173s chunked = False, response_conn = None, preload_content = True 173s decode_content = True, enforce_content_length = True 173s 173s def _make_request( 173s self, 173s conn: BaseHTTPConnection, 173s method: str, 173s url: str, 173s body: _TYPE_BODY | None = None, 173s headers: typing.Mapping[str, str] | None = None, 173s retries: Retry | None = None, 173s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 173s chunked: bool = False, 173s response_conn: BaseHTTPConnection | None = None, 173s preload_content: bool = True, 173s decode_content: bool = True, 173s enforce_content_length: bool = True, 173s ) -> BaseHTTPResponse: 173s """ 173s Perform a request on a given urllib connection object taken from our 173s pool. 173s 173s :param conn: 173s a connection from one of our connection pools 173s 173s :param method: 173s HTTP request method (such as GET, POST, PUT, etc.) 173s 173s :param url: 173s The URL to perform the request on. 
173s 173s :param body: 173s Data to send in the request body, either :class:`str`, :class:`bytes`, 173s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 173s 173s :param headers: 173s Dictionary of custom headers to send, such as User-Agent, 173s If-None-Match, etc. If None, pool headers are used. If provided, 173s these headers completely replace any pool-specific headers. 173s 173s :param retries: 173s Configure the number of retries to allow before raising a 173s :class:`~urllib3.exceptions.MaxRetryError` exception. 173s 173s Pass ``None`` to retry until you receive a response. Pass a 173s :class:`~urllib3.util.retry.Retry` object for fine-grained control 173s over different types of retries. 173s Pass an integer number to retry connection errors that many times, 173s but no other types of errors. Pass zero to never retry. 173s 173s If ``False``, then retries are disabled and any exception is raised 173s immediately. Also, instead of raising a MaxRetryError on redirects, 173s the redirect response will be returned. 173s 173s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 173s 173s :param timeout: 173s If specified, overrides the default timeout for this one 173s request. It may be a float (in seconds) or an instance of 173s :class:`urllib3.util.Timeout`. 173s 173s :param chunked: 173s If True, urllib3 will send the body using chunked transfer 173s encoding. Otherwise, urllib3 will send the body using the standard 173s content-length form. Defaults to False. 173s 173s :param response_conn: 173s Set this to ``None`` if you will handle releasing the connection or 173s set the connection to have the response release it. 173s 173s :param preload_content: 173s If True, the response's body will be preloaded during construction. 173s 173s :param decode_content: 173s If True, will attempt to decode the body based on the 173s 'content-encoding' header. 173s 173s :param enforce_content_length: 173s Enforce content length checking. Body returned by server must match 173s value of Content-Length header, if present. Otherwise, raise error. 173s """ 173s self.num_requests += 1 173s 173s timeout_obj = self._get_timeout(timeout) 173s timeout_obj.start_connect() 173s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 173s 173s try: 173s # Trigger any extra validation we need to do. 173s try: 173s self._validate_conn(conn) 173s except (SocketTimeout, BaseSSLError) as e: 173s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 173s raise 173s 173s # _validate_conn() starts the connection to an HTTPS proxy 173s # so we need to wrap errors with 'ProxyError' here too. 173s except ( 173s OSError, 173s NewConnectionError, 173s TimeoutError, 173s BaseSSLError, 173s CertificateError, 173s SSLError, 173s ) as e: 173s new_e: Exception = e 173s if isinstance(e, (BaseSSLError, CertificateError)): 173s new_e = SSLError(e) 173s # If the connection didn't successfully connect to it's proxy 173s # then there 173s if isinstance( 173s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 173s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 173s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 173s raise new_e 173s 173s # conn.request() calls http.client.*.request, not the method in 173s # urllib3.request. It also calls makefile (recv) on the socket. 
173s try: 173s conn.request( 173s method, 173s url, 173s body=body, 173s headers=headers, 173s chunked=chunked, 173s preload_content=preload_content, 173s decode_content=decode_content, 173s enforce_content_length=enforce_content_length, 173s ) 173s 173s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 173s # legitimately able to close the connection after sending a valid response. 173s # With this behaviour, the received response is still readable. 173s except BrokenPipeError: 173s pass 173s except OSError as e: 173s # MacOS/Linux 173s # EPROTOTYPE and ECONNRESET are needed on macOS 173s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 173s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 173s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 173s raise 173s 173s # Reset the timeout for the recv() on the socket 173s read_timeout = timeout_obj.read_timeout 173s 173s if not conn.is_closed: 173s # In Python 3 socket.py will catch EAGAIN and return None when you 173s # try and read into the file pointer created by http.client, which 173s # instead raises a BadStatusLine exception. Instead of catching 173s # the exception and assuming all BadStatusLine exceptions are read 173s # timeouts, check for a zero timeout before making the request. 173s if read_timeout == 0: 173s raise ReadTimeoutError( 173s self, url, f"Read timed out. (read timeout={read_timeout})" 173s ) 173s conn.timeout = read_timeout 173s 173s # Receive the response from the server 173s try: 173s response = conn.getresponse() 173s except (BaseSSLError, OSError) as e: 173s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 173s raise 173s 173s # Set properties that are used by the pooling layer. 
173s response.retries = retries 173s response._connection = response_conn # type: ignore[attr-defined] 173s response._pool = self # type: ignore[attr-defined] 173s 173s log.debug( 173s '%s://%s:%s "%s %s %s" %s %s', 173s self.scheme, 173s self.host, 173s self.port, 173s method, 173s url, 173s > response.version_string, 173s response.status, 173s response.length_remaining, 173s ) 173s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 173s 173s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 173s ----------------------------- Captured stderr call ----------------------------- 173s 127.0.0.1 - - [18/Jan/2025 03:25:44] "GET / HTTP/1.1" 200 9358 173s ___________________ test_https_with_cert_validation_disabled ___________________ 173s 173s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-1/test_https_with_cert_validatio0') 173s httpbin_secure = 173s pool_mgr = 173s 173s def test_https_with_cert_validation_disabled(tmpdir, httpbin_secure, pool_mgr): 173s with vcr.use_cassette(str(tmpdir.join("cert_validation_disabled.yaml"))): 173s > pool_mgr.request("GET", httpbin_secure.url) 173s 173s tests/integration/test_urllib3.py:149: 173s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 173s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:135: in request 173s return self.request_encode_url( 173s /usr/lib/python3/dist-packages/urllib3/_request_methods.py:182: in request_encode_url 173s return self.urlopen(method, url, **extra_kw) 173s /usr/lib/python3/dist-packages/urllib3/poolmanager.py:443: in urlopen 173s response = conn.urlopen(method, u.request_uri, **kw) 173s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 173s response = self._make_request( 173s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 173s 173s self = 173s conn = 173s method = 'GET', url = '/', body = None, headers = {} 173s retries = Retry(total=3, connect=None, read=None, redirect=None, status=None) 173s timeout = Timeout(connect=<_TYPE_DEFAULT.token: -1>, read=<_TYPE_DEFAULT.token: -1>, total=None) 173s chunked = False, response_conn = None, preload_content = True 173s decode_content = True, enforce_content_length = True 173s 173s def _make_request( 173s self, 173s conn: BaseHTTPConnection, 173s method: str, 173s url: str, 173s body: _TYPE_BODY | None = None, 173s headers: typing.Mapping[str, str] | None = None, 173s retries: Retry | None = None, 173s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 173s chunked: bool = False, 173s response_conn: BaseHTTPConnection | None = None, 173s preload_content: bool = True, 173s decode_content: bool = True, 173s enforce_content_length: bool = True, 173s ) -> BaseHTTPResponse: 173s """ 173s Perform a request on a given urllib connection object taken from our 173s pool. 173s 173s :param conn: 173s a connection from one of our connection pools 173s 173s :param method: 173s HTTP request method (such as GET, POST, PUT, etc.) 173s 173s :param url: 173s The URL to perform the request on. 173s 173s :param body: 173s Data to send in the request body, either :class:`str`, :class:`bytes`, 173s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 173s 173s :param headers: 173s Dictionary of custom headers to send, such as User-Agent, 173s If-None-Match, etc. If None, pool headers are used. If provided, 173s these headers completely replace any pool-specific headers. 
173s 173s :param retries: 173s Configure the number of retries to allow before raising a 173s :class:`~urllib3.exceptions.MaxRetryError` exception. 173s 173s Pass ``None`` to retry until you receive a response. Pass a 173s :class:`~urllib3.util.retry.Retry` object for fine-grained control 173s over different types of retries. 173s Pass an integer number to retry connection errors that many times, 173s but no other types of errors. Pass zero to never retry. 173s 173s If ``False``, then retries are disabled and any exception is raised 173s immediately. Also, instead of raising a MaxRetryError on redirects, 173s the redirect response will be returned. 173s 173s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 173s 173s :param timeout: 173s If specified, overrides the default timeout for this one 173s request. It may be a float (in seconds) or an instance of 173s :class:`urllib3.util.Timeout`. 173s 173s :param chunked: 173s If True, urllib3 will send the body using chunked transfer 173s encoding. Otherwise, urllib3 will send the body using the standard 173s content-length form. Defaults to False. 173s 173s :param response_conn: 173s Set this to ``None`` if you will handle releasing the connection or 173s set the connection to have the response release it. 173s 173s :param preload_content: 173s If True, the response's body will be preloaded during construction. 173s 173s :param decode_content: 173s If True, will attempt to decode the body based on the 173s 'content-encoding' header. 173s 173s :param enforce_content_length: 173s Enforce content length checking. Body returned by server must match 173s value of Content-Length header, if present. Otherwise, raise error. 173s """ 173s self.num_requests += 1 173s 173s timeout_obj = self._get_timeout(timeout) 173s timeout_obj.start_connect() 173s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 173s 173s try: 173s # Trigger any extra validation we need to do. 173s try: 173s self._validate_conn(conn) 173s except (SocketTimeout, BaseSSLError) as e: 173s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 173s raise 173s 173s # _validate_conn() starts the connection to an HTTPS proxy 173s # so we need to wrap errors with 'ProxyError' here too. 173s except ( 173s OSError, 173s NewConnectionError, 173s TimeoutError, 173s BaseSSLError, 173s CertificateError, 173s SSLError, 173s ) as e: 173s new_e: Exception = e 173s if isinstance(e, (BaseSSLError, CertificateError)): 173s new_e = SSLError(e) 173s # If the connection didn't successfully connect to it's proxy 173s # then there 173s if isinstance( 173s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 173s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 173s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 173s raise new_e 173s 173s # conn.request() calls http.client.*.request, not the method in 173s # urllib3.request. It also calls makefile (recv) on the socket. 
173s try: 173s conn.request( 173s method, 173s url, 173s body=body, 173s headers=headers, 173s chunked=chunked, 173s preload_content=preload_content, 173s decode_content=decode_content, 173s enforce_content_length=enforce_content_length, 173s ) 173s 173s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 173s # legitimately able to close the connection after sending a valid response. 173s # With this behaviour, the received response is still readable. 173s except BrokenPipeError: 173s pass 173s except OSError as e: 173s # MacOS/Linux 173s # EPROTOTYPE and ECONNRESET are needed on macOS 173s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 173s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 173s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 173s raise 173s 173s # Reset the timeout for the recv() on the socket 173s read_timeout = timeout_obj.read_timeout 173s 173s if not conn.is_closed: 173s # In Python 3 socket.py will catch EAGAIN and return None when you 173s # try and read into the file pointer created by http.client, which 173s # instead raises a BadStatusLine exception. Instead of catching 173s # the exception and assuming all BadStatusLine exceptions are read 173s # timeouts, check for a zero timeout before making the request. 173s if read_timeout == 0: 173s raise ReadTimeoutError( 173s self, url, f"Read timed out. (read timeout={read_timeout})" 173s ) 173s conn.timeout = read_timeout 173s 173s # Receive the response from the server 173s try: 173s response = conn.getresponse() 173s except (BaseSSLError, OSError) as e: 173s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 173s raise 173s 173s # Set properties that are used by the pooling layer. 173s response.retries = retries 173s response._connection = response_conn # type: ignore[attr-defined] 173s response._pool = self # type: ignore[attr-defined] 173s 173s log.debug( 173s '%s://%s:%s "%s %s %s" %s %s', 173s self.scheme, 173s self.host, 173s self.port, 173s method, 173s url, 173s > response.version_string, 173s response.status, 173s response.length_remaining, 173s ) 173s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 173s 173s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 173s ----------------------------- Captured stderr call ----------------------------- 173s 127.0.0.1 - - [18/Jan/2025 03:25:44] "GET / HTTP/1.1" 200 9358 173s _____________________________ test_domain_redirect _____________________________ 173s 173s def test_domain_redirect(): 173s """Ensure that redirects across domains are considered unique""" 173s # In this example, seomoz.org redirects to moz.com, and if those 173s # requests are considered identical, then we'll be stuck in a redirect 173s # loop.
173s url = "http://seomoz.org/" 173s with vcr.use_cassette("tests/fixtures/wild/domain_redirect.yaml") as cass: 173s > requests.get(url, headers={"User-Agent": "vcrpy-test"}) 173s 173s tests/integration/test_wild.py:20: 173s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 173s /usr/lib/python3/dist-packages/requests/api.py:73: in get 173s return request("get", url, params=params, **kwargs) 173s /usr/lib/python3/dist-packages/requests/api.py:59: in request 173s return session.request(method=method, url=url, **kwargs) 173s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 173s resp = self.send(prep, **send_kwargs) 173s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 173s r = adapter.send(request, **kwargs) 173s /usr/lib/python3/dist-packages/requests/adapters.py:667: in send 173s resp = conn.urlopen( 173s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 173s response = self._make_request( 173s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 173s 173s self = 173s conn = 173s method = 'GET', url = '/', body = None 173s headers = {'User-Agent': 'vcrpy-test', 'Accept-Encoding': 'gzip, deflate, br', 'Accept': '*/*', 'Connection': 'keep-alive'} 173s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 173s timeout = Timeout(connect=None, read=None, total=None), chunked = False 173s response_conn = 173s preload_content = False, decode_content = False, enforce_content_length = True 173s 173s def _make_request( 173s self, 173s conn: BaseHTTPConnection, 173s method: str, 173s url: str, 173s body: _TYPE_BODY | None = None, 173s headers: typing.Mapping[str, str] | None = None, 173s retries: Retry | None = None, 173s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 173s chunked: bool = False, 173s response_conn: BaseHTTPConnection | None = None, 173s preload_content: bool = True, 173s decode_content: bool = True, 173s enforce_content_length: bool = True, 173s ) -> BaseHTTPResponse: 173s """ 173s Perform a request on a given urllib connection object taken from our 173s pool. 173s 173s :param conn: 173s a connection from one of our connection pools 173s 173s :param method: 173s HTTP request method (such as GET, POST, PUT, etc.) 173s 173s :param url: 173s The URL to perform the request on. 173s 173s :param body: 173s Data to send in the request body, either :class:`str`, :class:`bytes`, 173s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 173s 173s :param headers: 173s Dictionary of custom headers to send, such as User-Agent, 173s If-None-Match, etc. If None, pool headers are used. If provided, 173s these headers completely replace any pool-specific headers. 173s 173s :param retries: 173s Configure the number of retries to allow before raising a 173s :class:`~urllib3.exceptions.MaxRetryError` exception. 173s 173s Pass ``None`` to retry until you receive a response. Pass a 173s :class:`~urllib3.util.retry.Retry` object for fine-grained control 173s over different types of retries. 173s Pass an integer number to retry connection errors that many times, 173s but no other types of errors. Pass zero to never retry. 173s 173s If ``False``, then retries are disabled and any exception is raised 173s immediately. Also, instead of raising a MaxRetryError on redirects, 173s the redirect response will be returned. 173s 173s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 
173s 173s :param timeout: 173s If specified, overrides the default timeout for this one 173s request. It may be a float (in seconds) or an instance of 173s :class:`urllib3.util.Timeout`. 173s 173s :param chunked: 173s If True, urllib3 will send the body using chunked transfer 173s encoding. Otherwise, urllib3 will send the body using the standard 173s content-length form. Defaults to False. 173s 173s :param response_conn: 173s Set this to ``None`` if you will handle releasing the connection or 173s set the connection to have the response release it. 173s 173s :param preload_content: 173s If True, the response's body will be preloaded during construction. 173s 173s :param decode_content: 173s If True, will attempt to decode the body based on the 173s 'content-encoding' header. 173s 173s :param enforce_content_length: 173s Enforce content length checking. Body returned by server must match 173s value of Content-Length header, if present. Otherwise, raise error. 173s """ 173s self.num_requests += 1 173s 173s timeout_obj = self._get_timeout(timeout) 173s timeout_obj.start_connect() 173s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 173s 173s try: 173s # Trigger any extra validation we need to do. 173s try: 173s self._validate_conn(conn) 173s except (SocketTimeout, BaseSSLError) as e: 173s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 173s raise 173s 173s # _validate_conn() starts the connection to an HTTPS proxy 173s # so we need to wrap errors with 'ProxyError' here too. 173s except ( 173s OSError, 173s NewConnectionError, 173s TimeoutError, 173s BaseSSLError, 173s CertificateError, 173s SSLError, 173s ) as e: 173s new_e: Exception = e 173s if isinstance(e, (BaseSSLError, CertificateError)): 173s new_e = SSLError(e) 173s # If the connection didn't successfully connect to it's proxy 173s # then there 173s if isinstance( 173s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 173s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 173s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 173s raise new_e 173s 173s # conn.request() calls http.client.*.request, not the method in 173s # urllib3.request. It also calls makefile (recv) on the socket. 173s try: 173s conn.request( 173s method, 173s url, 173s body=body, 173s headers=headers, 173s chunked=chunked, 173s preload_content=preload_content, 173s decode_content=decode_content, 173s enforce_content_length=enforce_content_length, 173s ) 173s 173s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 173s # legitimately able to close the connection after sending a valid response. 173s # With this behaviour, the received response is still readable. 173s except BrokenPipeError: 173s pass 173s except OSError as e: 173s # MacOS/Linux 173s # EPROTOTYPE and ECONNRESET are needed on macOS 173s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 173s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 173s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 173s raise 173s 173s # Reset the timeout for the recv() on the socket 173s read_timeout = timeout_obj.read_timeout 173s 173s if not conn.is_closed: 173s # In Python 3 socket.py will catch EAGAIN and return None when you 173s # try and read into the file pointer created by http.client, which 173s # instead raises a BadStatusLine exception. 
Instead of catching 173s # the exception and assuming all BadStatusLine exceptions are read 173s # timeouts, check for a zero timeout before making the request. 173s if read_timeout == 0: 173s raise ReadTimeoutError( 173s self, url, f"Read timed out. (read timeout={read_timeout})" 173s ) 173s conn.timeout = read_timeout 173s 173s # Receive the response from the server 173s try: 173s response = conn.getresponse() 173s except (BaseSSLError, OSError) as e: 173s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 173s raise 173s 173s # Set properties that are used by the pooling layer. 173s response.retries = retries 173s response._connection = response_conn # type: ignore[attr-defined] 173s response._pool = self # type: ignore[attr-defined] 173s 173s log.debug( 173s '%s://%s:%s "%s %s %s" %s %s', 173s self.scheme, 173s self.host, 173s self.port, 173s method, 173s url, 173s > response.version_string, 173s response.status, 173s response.length_remaining, 173s ) 173s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 173s 173s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 173s _________________________________ test_cookies _________________________________ 173s 173s tmpdir = local('/tmp/pytest-of-ubuntu/pytest-1/test_cookies0') 173s httpbin = 173s 173s def test_cookies(tmpdir, httpbin): 173s testfile = str(tmpdir.join("cookies.yml")) 173s with vcr.use_cassette(testfile): 173s with requests.Session() as s: 173s > s.get(httpbin.url + "/cookies/set?k1=v1&k2=v2") 173s 173s tests/integration/test_wild.py:67: 173s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 173s /usr/lib/python3/dist-packages/requests/sessions.py:602: in get 173s return self.request("GET", url, **kwargs) 173s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 173s resp = self.send(prep, **send_kwargs) 173s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 173s r = adapter.send(request, **kwargs) 173s /usr/lib/python3/dist-packages/requests/adapters.py:667: in send 173s resp = conn.urlopen( 173s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen 173s response = self._make_request( 173s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 173s 173s self = 173s conn = 173s method = 'GET', url = '/cookies/set?k1=v1&k2=v2', body = None 173s headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate, br', 'Accept': '*/*', 'Connection': 'keep-alive'} 173s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 173s timeout = Timeout(connect=None, read=None, total=None), chunked = False 173s response_conn = 173s preload_content = False, decode_content = False, enforce_content_length = True 173s 173s def _make_request( 173s self, 173s conn: BaseHTTPConnection, 173s method: str, 173s url: str, 173s body: _TYPE_BODY | None = None, 173s headers: typing.Mapping[str, str] | None = None, 173s retries: Retry | None = None, 173s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 173s chunked: bool = False, 173s response_conn: BaseHTTPConnection | None = None, 173s preload_content: bool = True, 173s decode_content: bool = True, 173s enforce_content_length: bool = True, 173s ) -> BaseHTTPResponse: 173s """ 173s Perform a request on a given urllib connection object taken from our 173s pool. 
173s 173s :param conn: 173s a connection from one of our connection pools 173s 173s :param method: 173s HTTP request method (such as GET, POST, PUT, etc.) 173s 173s :param url: 173s The URL to perform the request on. 173s 173s :param body: 173s Data to send in the request body, either :class:`str`, :class:`bytes`, 173s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 173s 173s :param headers: 173s Dictionary of custom headers to send, such as User-Agent, 173s If-None-Match, etc. If None, pool headers are used. If provided, 173s these headers completely replace any pool-specific headers. 173s 173s :param retries: 173s Configure the number of retries to allow before raising a 173s :class:`~urllib3.exceptions.MaxRetryError` exception. 173s 173s Pass ``None`` to retry until you receive a response. Pass a 173s :class:`~urllib3.util.retry.Retry` object for fine-grained control 173s over different types of retries. 173s Pass an integer number to retry connection errors that many times, 173s but no other types of errors. Pass zero to never retry. 173s 173s If ``False``, then retries are disabled and any exception is raised 173s immediately. Also, instead of raising a MaxRetryError on redirects, 173s the redirect response will be returned. 173s 173s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 173s 173s :param timeout: 173s If specified, overrides the default timeout for this one 173s request. It may be a float (in seconds) or an instance of 173s :class:`urllib3.util.Timeout`. 173s 173s :param chunked: 173s If True, urllib3 will send the body using chunked transfer 173s encoding. Otherwise, urllib3 will send the body using the standard 173s content-length form. Defaults to False. 173s 173s :param response_conn: 173s Set this to ``None`` if you will handle releasing the connection or 173s set the connection to have the response release it. 173s 173s :param preload_content: 173s If True, the response's body will be preloaded during construction. 173s 173s :param decode_content: 173s If True, will attempt to decode the body based on the 173s 'content-encoding' header. 173s 173s :param enforce_content_length: 173s Enforce content length checking. Body returned by server must match 173s value of Content-Length header, if present. Otherwise, raise error. 173s """ 173s self.num_requests += 1 173s 173s timeout_obj = self._get_timeout(timeout) 173s timeout_obj.start_connect() 173s conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 173s 173s try: 173s # Trigger any extra validation we need to do. 173s try: 173s self._validate_conn(conn) 173s except (SocketTimeout, BaseSSLError) as e: 173s self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 173s raise 173s 173s # _validate_conn() starts the connection to an HTTPS proxy 173s # so we need to wrap errors with 'ProxyError' here too. 173s except ( 173s OSError, 173s NewConnectionError, 173s TimeoutError, 173s BaseSSLError, 173s CertificateError, 173s SSLError, 173s ) as e: 173s new_e: Exception = e 173s if isinstance(e, (BaseSSLError, CertificateError)): 173s new_e = SSLError(e) 173s # If the connection didn't successfully connect to it's proxy 173s # then there 173s if isinstance( 173s new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 173s ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 173s new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 173s raise new_e 173s 173s # conn.request() calls http.client.*.request, not the method in 173s # urllib3.request. 
It also calls makefile (recv) on the socket. 173s try: 173s conn.request( 173s method, 173s url, 173s body=body, 173s headers=headers, 173s chunked=chunked, 173s preload_content=preload_content, 173s decode_content=decode_content, 173s enforce_content_length=enforce_content_length, 173s ) 173s 173s # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 173s # legitimately able to close the connection after sending a valid response. 173s # With this behaviour, the received response is still readable. 173s except BrokenPipeError: 173s pass 173s except OSError as e: 173s # MacOS/Linux 173s # EPROTOTYPE and ECONNRESET are needed on macOS 173s # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 173s # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 173s if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 173s raise 173s 173s # Reset the timeout for the recv() on the socket 173s read_timeout = timeout_obj.read_timeout 173s 173s if not conn.is_closed: 173s # In Python 3 socket.py will catch EAGAIN and return None when you 173s # try and read into the file pointer created by http.client, which 173s # instead raises a BadStatusLine exception. Instead of catching 173s # the exception and assuming all BadStatusLine exceptions are read 173s # timeouts, check for a zero timeout before making the request. 173s if read_timeout == 0: 173s raise ReadTimeoutError( 173s self, url, f"Read timed out. (read timeout={read_timeout})" 173s ) 173s conn.timeout = read_timeout 173s 173s # Receive the response from the server 173s try: 173s response = conn.getresponse() 173s except (BaseSSLError, OSError) as e: 173s self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 173s raise 173s 173s # Set properties that are used by the pooling layer. 173s response.retries = retries 173s response._connection = response_conn # type: ignore[attr-defined] 173s response._pool = self # type: ignore[attr-defined] 173s 173s log.debug( 173s '%s://%s:%s "%s %s %s" %s %s', 173s self.scheme, 173s self.host, 173s self.port, 173s method, 173s url, 173s > response.version_string, 173s response.status, 173s response.length_remaining, 173s ) 173s E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' 173s 173s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError 173s ----------------------------- Captured stderr call ----------------------------- 173s 127.0.0.1 - - [18/Jan/2025 03:25:44] "GET /cookies/set?k1=v1&k2=v2 HTTP/1.1" 302 203 173s =============================== warnings summary =============================== 173s tests/integration/test_config.py:10 173s /tmp/autopkgtest.MqadJQ/autopkgtest_tmp/build/tests/integration/test_config.py:10: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 173s @pytest.mark.online 173s 173s tests/integration/test_config.py:24 173s /tmp/autopkgtest.MqadJQ/autopkgtest_tmp/build/tests/integration/test_config.py:24: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? 
You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 173s @pytest.mark.online 173s 173s tests/integration/test_config.py:34 173s /tmp/autopkgtest.MqadJQ/autopkgtest_tmp/build/tests/integration/test_config.py:34: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 173s @pytest.mark.online 173s 173s tests/integration/test_config.py:47 173s /tmp/autopkgtest.MqadJQ/autopkgtest_tmp/build/tests/integration/test_config.py:47: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 173s @pytest.mark.online 173s 173s tests/integration/test_config.py:69 173s /tmp/autopkgtest.MqadJQ/autopkgtest_tmp/build/tests/integration/test_config.py:69: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 173s @pytest.mark.online 173s 173s tests/integration/test_disksaver.py:14 173s /tmp/autopkgtest.MqadJQ/autopkgtest_tmp/build/tests/integration/test_disksaver.py:14: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 173s @pytest.mark.online 173s 173s tests/integration/test_disksaver.py:35 173s /tmp/autopkgtest.MqadJQ/autopkgtest_tmp/build/tests/integration/test_disksaver.py:35: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 173s @pytest.mark.online 173s 173s tests/integration/test_httplib2.py:60 173s /tmp/autopkgtest.MqadJQ/autopkgtest_tmp/build/tests/integration/test_httplib2.py:60: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 173s @pytest.mark.online 173s 173s tests/integration/test_register_matcher.py:16 173s /tmp/autopkgtest.MqadJQ/autopkgtest_tmp/build/tests/integration/test_register_matcher.py:16: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 173s @pytest.mark.online 173s 173s tests/integration/test_register_matcher.py:32 173s /tmp/autopkgtest.MqadJQ/autopkgtest_tmp/build/tests/integration/test_register_matcher.py:32: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 173s @pytest.mark.online 173s 173s tests/integration/test_urllib2.py:60 173s /tmp/autopkgtest.MqadJQ/autopkgtest_tmp/build/tests/integration/test_urllib2.py:60: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? 
You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 173s @mark.online 173s 173s tests/integration/test_urllib3.py:102 173s /tmp/autopkgtest.MqadJQ/autopkgtest_tmp/build/tests/integration/test_urllib3.py:102: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 173s @pytest.mark.online 173s 173s tests/integration/test_wild.py:55 173s /tmp/autopkgtest.MqadJQ/autopkgtest_tmp/build/tests/integration/test_wild.py:55: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 173s @pytest.mark.online 173s 173s tests/integration/test_wild.py:74 173s /tmp/autopkgtest.MqadJQ/autopkgtest_tmp/build/tests/integration/test_wild.py:74: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 173s @pytest.mark.online 173s 173s tests/unit/test_stubs.py:20 173s /tmp/autopkgtest.MqadJQ/autopkgtest_tmp/build/tests/unit/test_stubs.py:20: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 173s @mark.online 173s 173s tests/unit/test_unittest.py:131 173s /tmp/autopkgtest.MqadJQ/autopkgtest_tmp/build/tests/unit/test_unittest.py:131: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 173s @pytest.mark.online 173s 173s tests/unit/test_unittest.py:166 173s /tmp/autopkgtest.MqadJQ/autopkgtest_tmp/build/tests/unit/test_unittest.py:166: PytestUnknownMarkWarning: Unknown pytest.mark.online - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 173s @pytest.mark.online 173s 173s tests/integration/test_wild.py::test_xmlrpclib 173s /usr/lib/python3.12/multiprocessing/popen_fork.py:66: DeprecationWarning: This process (pid=3492) is multi-threaded, use of fork() may lead to deadlocks in the child. 173s self.pid = os.fork() 173s 173s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 173s =========================== short test summary info ============================ 173s FAILED tests/integration/test_urllib3.py::test_status_code[http] - AttributeE... 173s FAILED tests/integration/test_urllib3.py::test_headers[http] - AttributeError... 173s FAILED tests/integration/test_urllib3.py::test_body[http] - AttributeError: '... 173s FAILED tests/integration/test_urllib3.py::test_auth[http] - AttributeError: '... 173s FAILED tests/integration/test_urllib3.py::test_auth_failed[http] - AttributeE... 173s FAILED tests/integration/test_urllib3.py::test_post[http] - AttributeError: '... 173s FAILED tests/integration/test_urllib3.py::test_gzip[http] - AttributeError: '... 173s FAILED tests/integration/test_urllib3.py::test_status_code[https] - Attribute... 173s FAILED tests/integration/test_urllib3.py::test_headers[https] - AttributeErro... 173s FAILED tests/integration/test_urllib3.py::test_body[https] - AttributeError: ... 
173s FAILED tests/integration/test_urllib3.py::test_auth[https] - AttributeError: ... 173s FAILED tests/integration/test_urllib3.py::test_auth_failed[https] - Attribute... 173s FAILED tests/integration/test_urllib3.py::test_post[https] - AttributeError: ... 173s FAILED tests/integration/test_urllib3.py::test_gzip[https] - AttributeError: ... 173s FAILED tests/integration/test_proxy.py::test_use_proxy - AttributeError: 'VCR... 173s FAILED tests/integration/test_urllib3.py::test_cross_scheme - AttributeError:... 173s FAILED tests/integration/test_urllib3.py::test_https_with_cert_validation_disabled 173s FAILED tests/integration/test_wild.py::test_domain_redirect - AttributeError:... 173s FAILED tests/integration/test_wild.py::test_cookies - AttributeError: 'VCRHTT... 173s ==== 19 failed, 265 passed, 3 skipped, 19 deselected, 18 warnings in 5.85s ===== 173s E: pybuild pybuild:389: test: plugin distutils failed with: exit code=1: cd /tmp/autopkgtest.MqadJQ/autopkgtest_tmp/build; python3.12 -m pytest --ignore tests/integration/test_aiohttp.py --ignore tests/integration/test_tornado.py --ignore tests/integration/test_requests.py -m "not online" -k "not test_basic_json_use and not test_load_cassette_with_custom_persister" 173s pybuild-autopkgtest: error: pybuild --autopkgtest --test-pytest -i python{version} -p "3.13 3.12" returned exit code 13 173s make: *** [/tmp/5fepkfNW8i/run:4: pybuild-autopkgtest] Error 25 173s pybuild-autopkgtest: error: /tmp/5fepkfNW8i/run pybuild-autopkgtest returned exit code 2 173s autopkgtest [03:25:47]: test pybuild-autopkgtest: -----------------------] 174s pybuild-autopkgtest FAIL non-zero exit status 25 174s autopkgtest [03:25:48]: test pybuild-autopkgtest: - - - - - - - - - - results - - - - - - - - - - 174s autopkgtest [03:25:48]: @@@@@@@@@@@@@@@@@@@@ summary 174s pybuild-autopkgtest FAIL non-zero exit status 25 179s nova [W] Using flock in prodstack6-ppc64el 179s Creating nova instance adt-plucky-ppc64el-vcr.py-20250118-032254-juju-7f2275-prod-proposed-migration-environment-20-31956513-6df4-4cc1-9d52-02a17818ef4e from image adt/ubuntu-plucky-ppc64el-server-20250118.img (UUID ae4ff9d6-d4c8-4087-90be-b2adeb15025d)... 179s nova [W] Timed out waiting for 95509478-c464-4ef5-b566-cd476b37028e to get deleted.
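
All 19 failures recorded above share a single root cause: urllib3 2.3.0 now reads response.version_string at /usr/lib/python3/dist-packages/urllib3/connectionpool.py:551 when it emits its debug log line, and the VCRHTTPResponse object that vcr.py 6.0.2 substitutes while replaying cassettes does not define that attribute. The snippet below is only a sketch of the kind of compatibility shim that would confirm the diagnosis; the class name VCRHTTPResponse comes from the AttributeError itself, while the vcr.stubs import path and the assumption that the stub stores the numeric HTTP version in a `version` attribute (as http.client.HTTPResponse does) are not shown in this log, and this is not the fix shipped by the vcr.py project.

    # Illustrative shim only, not vcr.py's actual patch.
    # Assumption: the replayed response keeps the numeric HTTP version (10/11)
    # in self.version, the way http.client.HTTPResponse does.
    from vcr.stubs import VCRHTTPResponse  # class name taken from the AttributeError above


    def _version_string(self) -> str:
        # urllib3 2.3.0 only uses version_string for the debug line quoted in the
        # tracebacks, so a best-effort string is enough to keep logging from raising.
        return {9: "HTTP/0.9", 10: "HTTP/1.0", 11: "HTTP/1.1"}.get(self.version, "HTTP/?")


    # Patch only when the installed vcr.py does not already provide the attribute.
    if not hasattr(VCRHTTPResponse, "version_string"):
        VCRHTTPResponse.version_string = property(_version_string)

Dropping such a shim into the test suite's conftest.py on the testbed would be one way to check whether version_string is the only attribute the stub is missing before an updated vcr.py with proper urllib3 2.3 support is packaged.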